A Nested Attention Neural Hybrid Model for Grammatical Error Correction (1707.02026v2)

Published 7 Jul 2017 in cs.CL

Abstract: Grammatical error correction (GEC) systems strive to correct both global errors in word order and usage, and local errors in spelling and inflection. Further developing upon recent work on neural machine translation, we propose a new hybrid neural model with nested attention layers for GEC. Experiments show that the new model can effectively correct errors of both types by incorporating word and character-level information, and that the model significantly outperforms previous neural models for GEC as measured on the standard CoNLL-14 benchmark dataset. Further analysis also shows that the superiority of the proposed model can be largely attributed to the use of the nested attention mechanism, which has proven particularly effective in correcting local errors that involve small edits in orthography.
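
For intuition, here is a minimal sketch of what a nested word-over-character attention step might look like: a word-level attention over encoder states, with a character-level attention nested inside each source word and weighted by the word-level distribution. All function names, tensor shapes, and the additive combination are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a nested attention step (assumed shapes and names).
import torch
import torch.nn.functional as F

def nested_attention_step(dec_state, word_states, char_states):
    """
    dec_state:   (batch, d)                    current decoder hidden state
    word_states: (batch, src_len, d)           word-level encoder states
    char_states: (batch, src_len, char_len, d) character-level states per word
    Returns a context vector combining word- and character-level information.
    """
    # Word-level attention: score each source word against the decoder state.
    word_scores = torch.einsum("bd,bsd->bs", dec_state, word_states)
    word_alpha = F.softmax(word_scores, dim=-1)          # (batch, src_len)

    # Nested character-level attention: within each source word, score its
    # characters against the same decoder state.
    char_scores = torch.einsum("bd,bscd->bsc", dec_state, char_states)
    char_beta = F.softmax(char_scores, dim=-1)           # (batch, src_len, char_len)

    # Character context per word, then reweight by the word-level attention so
    # character information concentrates on the words the decoder attends to.
    char_ctx = torch.einsum("bsc,bscd->bsd", char_beta, char_states)
    word_ctx = torch.einsum("bs,bsd->bd", word_alpha, word_states)
    nested_char_ctx = torch.einsum("bs,bsd->bd", word_alpha, char_ctx)

    return word_ctx + nested_char_ctx                    # (batch, d)

if __name__ == "__main__":
    b, s, c, d = 2, 5, 8, 16  # toy dimensions
    ctx = nested_attention_step(torch.randn(b, d),
                                torch.randn(b, s, d),
                                torch.randn(b, s, c, d))
    print(ctx.shape)  # torch.Size([2, 16])
```

The point of nesting, as the abstract suggests, is that the character-level attention gives the decoder access to sub-word orthography, which is what makes small spelling and inflection edits correctable.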

Authors (6)
  1. Jianshu Ji (4 papers)
  2. Qinlong Wang (5 papers)
  3. Kristina Toutanova (31 papers)
  4. Yongen Gong (1 paper)
  5. Steven Truong (1 paper)
  6. Jianfeng Gao (344 papers)
Citations (107)