Hierarchical Gated Recurrent Neural Tensor Network for Answer Triggering (1709.05599v1)
Published 17 Sep 2017 in cs.CL
Abstract: In this paper, we focus on the problem of answer triggering addressed by Yang et al. (2015), which is a critical component for a real-world question answering system. We employ a hierarchical gated recurrent neural tensor (HGRNT) model to capture both the context information and the deep interactions between the candidate answers and the question. Our model achieves an F value of 42.6%, surpassing the baseline by over 10%.
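As a rough sketch of the core idea (all names, dimensions, and weights below are illustrative assumptions, not the authors' implementation): encode the question and each candidate answer with a gated recurrent unit, then score their deep interaction with a neural tensor layer.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8  # hidden size (illustrative)
K = 4  # number of tensor slices (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_encode(xs, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run a GRU over a sequence of input vectors; return the final hidden state."""
    h = np.zeros(H)
    for x in xs:
        z = sigmoid(Wz @ x + Uz @ h)              # update gate
        r = sigmoid(Wr @ x + Ur @ h)              # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
        h = (1 - z) * h + z * h_tilde
    return h

def tensor_score(q, a, W, V, b, u):
    """Neural tensor layer: per-slice bilinear interaction plus a linear term."""
    bilinear = np.array([q @ W[k] @ a for k in range(K)])
    linear = V @ np.concatenate([q, a])
    return float(u @ np.tanh(bilinear + linear + b))

# Random parameters stand in for trained weights.
gru_params = [rng.normal(size=(H, H)) for _ in range(6)]
W = rng.normal(size=(K, H, H))
V = rng.normal(size=(K, 2 * H))
b = rng.normal(size=K)
u = rng.normal(size=K)

# Toy question/answer token embeddings (illustrative).
question = [rng.normal(size=H) for _ in range(5)]
answer = [rng.normal(size=H) for _ in range(7)]

q_vec = gru_encode(question, *gru_params)
a_vec = gru_encode(answer, *gru_params)
print(tensor_score(q_vec, a_vec, W, V, b, u))
```

In an answer-triggering setting, a candidate answer would be accepted only if its best interaction score clears a threshold; the hierarchical aspect of the paper's model (sentence- and context-level encoders) is omitted here for brevity.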