Learning Contextualized Knowledge Structures for Commonsense Reasoning (2010.12873v3)
Abstract: Recently, knowledge graph (KG) augmented models have achieved noteworthy success on various commonsense reasoning tasks. However, KG edge (fact) sparsity and noisy edge extraction/generation often hinder models from obtaining useful knowledge to reason over. To address these issues, we propose a new KG-augmented model: Hybrid Graph Network (HGN). Unlike prior methods, HGN learns to jointly contextualize extracted and generated knowledge by reasoning over both within a unified graph structure. Given the task input context and an extracted KG subgraph, HGN is trained to generate embeddings for the subgraph's missing edges to form a "hybrid" graph, then reason over the hybrid graph while filtering out context-irrelevant edges. We demonstrate HGN's effectiveness through considerable performance gains across four commonsense reasoning benchmarks, as well as a user study on edge validity and helpfulness.
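The abstract's description of hybrid-graph construction and context-based edge filtering can be sketched roughly as follows. This is a minimal illustration only: the module names, dimensions, pairwise edge generation, and sigmoid gating below are assumptions made for the sketch, not the paper's actual architecture or training procedure.

```python
# Illustrative sketch of a "hybrid graph" reasoner in PyTorch.
# Assumption: missing edges are generated for all node pairs absent from the
# extracted subgraph, and a learned gate softly filters context-irrelevant edges.
import torch
import torch.nn as nn

class HybridGraphSketch(nn.Module):
    def __init__(self, dim=128):
        super().__init__()
        self.edge_generator = nn.Linear(2 * dim, dim)  # embeds missing (head, tail) pairs
        self.edge_scorer = nn.Linear(2 * dim, 1)       # scores edges against the context
        self.message = nn.Linear(2 * dim, dim)         # edge-conditioned messages

    def forward(self, node_emb, extracted_edges, extracted_edge_emb, context_emb):
        # node_emb: (num_nodes, dim); context_emb: (dim,)
        # extracted_edges: (num_extracted, 2) long tensor of (head, tail) indices
        # extracted_edge_emb: (num_extracted, dim) embeddings of extracted facts
        n = node_emb.size(0)
        # 1) Find node pairs missing from the extracted subgraph.
        all_pairs = torch.cartesian_prod(torch.arange(n), torch.arange(n))
        all_pairs = all_pairs[all_pairs[:, 0] != all_pairs[:, 1]]
        extracted_set = {tuple(e.tolist()) for e in extracted_edges}
        missing = torch.tensor(
            [p.tolist() for p in all_pairs if tuple(p.tolist()) not in extracted_set],
            dtype=torch.long)
        # 2) Generate embeddings for the missing edges and form the hybrid graph.
        gen_emb = self.edge_generator(
            torch.cat([node_emb[missing[:, 0]], node_emb[missing[:, 1]]], dim=-1))
        edges = torch.cat([extracted_edges, missing], dim=0)
        edge_emb = torch.cat([extracted_edge_emb, gen_emb], dim=0)
        # 3) Score each edge against the task context; low-scoring edges are
        #    softly filtered out of message passing.
        ctx = context_emb.expand(edge_emb.size(0), -1)
        gate = torch.sigmoid(self.edge_scorer(torch.cat([edge_emb, ctx], dim=-1)))
        # 4) One round of gated message passing over the hybrid graph.
        msgs = self.message(torch.cat([node_emb[edges[:, 0]], edge_emb], dim=-1)) * gate
        updated = node_emb.index_add(0, edges[:, 1], msgs)
        return updated, gate
```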
- Jun Yan
- Mrigank Raman
- Aaron Chan
- Tianyu Zhang
- Ryan Rossi
- Handong Zhao
- Sungchul Kim
- Nedim Lipka
- Xiang Ren