
Attention-Based Convolutional Neural Network for Machine Comprehension (1602.04341v1)

Published 13 Feb 2016 in cs.CL

Abstract: Understanding open-domain text is one of the primary challenges in NLP. Machine comprehension benchmarks evaluate a system's ability to understand text based on the text content alone. In this work, we investigate machine comprehension on MCTest, a question answering (QA) benchmark. Prior work is mainly based on feature engineering. We propose a neural network framework, the hierarchical attention-based convolutional neural network (HABCNN), to address this task without any manually designed features. Specifically, we explore HABCNN along two routes: one through traditional joint modeling of passage, question, and answer, the other through textual entailment. HABCNN employs an attention mechanism to detect key phrases, key sentences, and key snippets that are relevant to answering the question. Experiments show that HABCNN outperforms prior deep learning approaches by a large margin.
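
The abstract only sketches the architecture. The PyTorch snippet below is a minimal illustration of the core idea it describes: a CNN encodes each passage sentence and the question, and an attention mechanism weights sentences by their relevance to the question. It is not the authors' exact HABCNN; all module names, tensor shapes, and hyperparameters here are illustrative assumptions.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentiveSentenceCNN(nn.Module):
        """Sketch of a question-guided attention over CNN-encoded sentences."""
        def __init__(self, emb_dim=50, n_filters=64, kernel_size=3):
            super().__init__()
            # 1-D convolution over word positions within a sentence
            self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                                  padding=kernel_size // 2)

        def encode(self, x):
            # x: (batch, seq_len, emb_dim) -> (batch, n_filters)
            h = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, n_filters, seq_len)
            return h.max(dim=2).values                    # max-pool over positions

        def forward(self, question, sentences):
            # question:  (batch, q_len, emb_dim)
            # sentences: (batch, n_sent, s_len, emb_dim)
            b, n, s, d = sentences.shape
            q = self.encode(question)                                        # (batch, n_filters)
            sent = self.encode(sentences.view(b * n, s, d)).view(b, n, -1)   # (batch, n_sent, n_filters)
            # attention: score each sentence against the question
            scores = torch.einsum("bf,bnf->bn", q, sent)
            alpha = F.softmax(scores, dim=1)              # attention weights per sentence
            # question-focused passage representation
            passage = torch.einsum("bn,bnf->bf", alpha, sent)
            return passage, alpha

A possible use, loosely mirroring the joint-modeling route: score each candidate answer by its similarity to the question-focused passage representation (the similarity measure below is an illustrative choice, not taken from the paper).

    model = AttentiveSentenceCNN()
    question = torch.randn(2, 10, 50)       # batch of 2 questions, 10 words each
    sentences = torch.randn(2, 5, 20, 50)   # 5 sentences of 20 words per passage
    passage, alpha = model(question, sentences)
    answer = model.encode(torch.randn(2, 8, 50))   # one candidate answer, 8 words
    score = F.cosine_similarity(passage, answer)   # (batch,)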

Authors (3)
  1. Wenpeng Yin (69 papers)
  2. Sebastian Ebert (6 papers)
  3. Hinrich Schütze (250 papers)
Citations (98)
