
Consensus Attention-based Neural Networks for Chinese Reading Comprehension (1607.02250v3)

Published 8 Jul 2016 in cs.CL and cs.NE

Abstract: Reading comprehension has seen a boom in recent NLP research. Several institutes have released Cloze-style reading comprehension datasets, which have greatly accelerated research on machine comprehension. In this work, we first present Chinese reading comprehension datasets, consisting of a People Daily news dataset and a Children's Fairy Tale (CFT) dataset. We also propose a consensus attention-based neural network architecture to tackle the Cloze-style reading comprehension problem, which aims to induce a consensus attention over every word in the query. Experimental results show that the proposed neural network significantly outperforms state-of-the-art baselines on several public datasets. Furthermore, we set up a baseline for the Chinese reading comprehension task, which we hope will speed up future research.
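
As a rough illustration of the mechanism the abstract describes, below is a minimal PyTorch sketch of consensus attention: one attention distribution over the document is computed per query word, and the per-word distributions are merged into a single consensus distribution. It assumes pre-computed contextual encodings (this line of work typically uses bi-GRU encoders), dot-product attention scores, simple avg/sum/max merging heuristics, and attention-sum answer selection; the function names and exact merging choices here are illustrative assumptions, not the paper's verbatim specification.

```python
import torch
import torch.nn.functional as F

def consensus_attention(doc_enc, query_enc, merge="avg"):
    """Sketch of consensus attention over every query word.

    doc_enc:   (doc_len, hidden)   contextual document encodings
    query_enc: (query_len, hidden) contextual query encodings
    Returns one attention distribution over document positions.
    """
    # One attention distribution per query word (softmax over the document).
    scores = query_enc @ doc_enc.T           # (query_len, doc_len)
    alpha = F.softmax(scores, dim=1)         # per-query-word attentions

    # Merge the per-word attentions into a single consensus vector.
    if merge == "avg":
        s = alpha.mean(dim=0)
    elif merge == "sum":
        s = alpha.sum(dim=0)
    elif merge == "max":
        s = alpha.max(dim=0).values
    else:
        raise ValueError(f"unknown merge mode: {merge}")
    return F.softmax(s, dim=0)               # renormalize to a distribution

def answer_probability(consensus, doc_tokens, candidate):
    # Attention-sum style: a candidate's probability is the total consensus
    # attention mass at every document position where it occurs.
    return sum(p.item() for tok, p in zip(doc_tokens, consensus)
               if tok == candidate)
```

For example, with a 100-token document and a 10-token query, `consensus_attention` reduces a (10, 100) attention matrix to a single length-100 distribution, and `answer_probability` scores each candidate answer word against it.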

Authors (5)
  1. Yiming Cui (80 papers)
  2. Ting Liu (329 papers)
  3. Zhipeng Chen (46 papers)
  4. Shijin Wang (69 papers)
  5. Guoping Hu (39 papers)
Citations (87)