Weakly Supervised Grammatical Error Correction using Iterative Decoding (1811.01710v1)

Published 31 Oct 2018 in cs.CL, cs.LG, and stat.ML

Abstract: We describe an approach to Grammatical Error Correction (GEC) that is effective at making use of models trained on large amounts of weakly supervised bitext. We train the Transformer sequence-to-sequence model on 4B tokens of Wikipedia revisions and employ an iterative decoding strategy that is tailored to the loosely-supervised nature of the Wikipedia training corpus. Finetuning on the Lang-8 corpus and ensembling yields an F0.5 of 58.3 on the CoNLL'14 benchmark and a GLEU of 62.4 on JFLEG. The combination of weakly supervised training and iterative decoding obtains an F0.5 of 48.2 on CoNLL'14 even without using any labeled GEC data.
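The iterative decoding strategy described in the abstract can be thought of as repeatedly feeding the model's own output back in as input, accepting a rewrite only when the model clearly prefers it over leaving the sentence unchanged. The sketch below illustrates this idea under stated assumptions; the wrapper methods `model.beam_search`, `model.score`, and the `margin` and `max_iters` parameters are hypothetical names, not the paper's actual API.

```python
def iterative_decode(sentence, model, max_iters=5, margin=0.0):
    """Minimal sketch of iterative decoding for GEC.

    Assumes a hypothetical seq2seq wrapper exposing:
      - model.beam_search(src) -> (best_hypothesis, best_cost)
      - model.score(src, tgt)  -> cost of forcing output `tgt` given `src`
    Costs are negative log-probabilities (lower is better).
    A rewrite is accepted only if it beats the identity (copy)
    hypothesis by `margin`, which guards against over-editing when
    the model is trained on noisy, weakly supervised bitext.
    """
    current = sentence
    for _ in range(max_iters):
        hypothesis, hyp_cost = model.beam_search(current)
        copy_cost = model.score(current, current)  # cost of leaving the input unchanged
        # Stop if decoding converged or the model does not prefer the rewrite
        # by at least the required margin.
        if hypothesis == current or hyp_cost + margin >= copy_cost:
            break
        current = hypothesis
    return current
```

In this sketch, raising `margin` makes the procedure more conservative (fewer edits per pass), while additional iterations allow the model to apply corrections incrementally rather than in a single pass.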

Authors (5)
  1. Jared Lichtarge (6 papers)
  2. Christopher Alberti (1 paper)
  3. Shankar Kumar (34 papers)
  4. Noam Shazeer (37 papers)
  5. Niki Parmar (17 papers)
Citations (22)