
Is Graph Structure Necessary for Multi-hop Question Answering? (2004.03096v2)

Published 7 Apr 2020 in cs.CL

Abstract: Recently, modeling texts as graph structures and introducing graph neural networks to process them has become a trend in many NLP research areas. In this paper, we investigate whether graph structure is necessary for multi-hop question answering. Our analysis is centered on HotpotQA. We construct a strong baseline model to establish that, with the proper use of pre-trained models, graph structure may not be necessary for multi-hop question answering. We point out that both the graph structure and the adjacency matrix are task-related prior knowledge, and that graph-attention can be considered a special case of self-attention. Experiments and visualized analysis demonstrate that graph-attention, or the entire graph structure, can be replaced by self-attention or Transformers.
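The abstract's claim that graph-attention is a special case of self-attention can be sketched concretely: masking self-attention scores with the graph's adjacency matrix restricts each node to attend only to its neighbors, which recovers graph-attention, and an all-ones adjacency recovers plain self-attention. The snippet below is a minimal NumPy illustration of this relationship, not the paper's implementation; the function and variable names are hypothetical.

```python
import numpy as np

def self_attention(Q, K, V, mask=None):
    """Scaled dot-product attention. An optional boolean mask
    restricts which positions may attend to which."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    if mask is not None:
        # Block attention between non-adjacent node pairs.
        scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))  # 4 nodes, 8-dim features

# Adjacency matrix of a small chain graph (with self-loops):
# attention masked by it behaves like graph-attention.
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=bool)

out_graph = self_attention(X, X, X, mask=adj)  # "graph-attention"
out_full = self_attention(X, X, X)             # plain self-attention
```

With a fully connected adjacency (all ones), the masked version coincides with unmasked self-attention, which is the sense in which the adjacency matrix is just prior knowledge injected into the attention pattern.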

Authors (5)
  1. Nan Shao (6 papers)
  2. Yiming Cui (80 papers)
  3. Ting Liu (329 papers)
  4. Shijin Wang (69 papers)
  5. Guoping Hu (39 papers)
Citations (16)
