Probabilistic Graph Reasoning for Natural Proof Generation (2107.02418v1)
Abstract: In this paper, we investigate the problem of reasoning over natural language statements. Prior neural-based approaches do not explicitly model the inter-dependency between answers and their proofs. We propose PRobr, a novel approach for joint answer prediction and proof generation. PRobr defines a joint probability distribution over all possible proof graphs and answers via an induced graphical model. We then optimize the model using variational approximation on top of neural textual representations. Experiments on multiple datasets under diverse settings (fully supervised, few-shot, and zero-shot evaluation) verify the effectiveness of PRobr, e.g., achieving 10%-30% improvement in QA accuracy under few/zero-shot evaluation. Our code and models can be found at https://github.com/changzhisun/PRobr/.
- Changzhi Sun (18 papers)
- Xinbo Zhang (6 papers)
- Jiangjie Chen (46 papers)
- Chun Gan (6 papers)
- Yuanbin Wu (47 papers)
- Jiaze Chen (17 papers)
- Hao Zhou (351 papers)
- Lei Li (1293 papers)
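The abstract mentions optimizing a joint distribution over proof graphs and answers via variational approximation. The following is a minimal toy sketch of that idea, not the paper's actual model: the "proof graphs" are reduced to three discrete latent choices, the scores are arbitrary invented numbers, and the variational objective is the standard ELBO, which lower-bounds the log marginal likelihood of an answer and becomes tight at the exact posterior.

```python
import math

# Hypothetical unnormalized scores for answers a ∈ {0,1} and latent
# "proof graphs" z ∈ {0,1,2}; in PRobr these would come from a neural
# text encoder, here they are arbitrary toy values.
scores = {
    (0, 0): 1.0, (0, 1): 0.2, (0, 2): -0.5,
    (1, 0): 0.3, (1, 1): 2.0, (1, 2): 0.7,
}

# Log partition function normalizing the joint distribution p(a, z).
log_Z = math.log(sum(math.exp(s) for s in scores.values()))

def log_joint(a, z):
    """log p(a, z) under the toy graphical model."""
    return scores[(a, z)] - log_Z

def log_marginal(a):
    """log p(a) = log Σ_z p(a, z), marginalizing out the proof."""
    return math.log(sum(math.exp(log_joint(a, z)) for z in range(3)))

def elbo(a, q):
    """Evidence lower bound E_q[log p(a, z) - log q(z)] ≤ log p(a),
    where q is a variational distribution over proofs z."""
    return sum(q[z] * (log_joint(a, z) - math.log(q[z]))
               for z in range(3) if q[z] > 0)

# A uniform q gives a strict lower bound on log p(a);
# the exact posterior q(z) = p(z | a) makes the bound tight.
a = 1
uniform = [1 / 3] * 3
posterior = [math.exp(log_joint(a, z) - log_marginal(a)) for z in range(3)]
assert elbo(a, uniform) <= log_marginal(a) + 1e-9
assert abs(elbo(a, posterior) - log_marginal(a)) < 1e-9
```

In the actual paper the proof space is combinatorial and the scores are produced by a pretrained text encoder, but the same bound drives the training objective: tightening the ELBO jointly improves answer prediction and proof inference.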