
The Neural Noisy Channel (1611.02554v2)

Published 8 Nov 2016 in cs.CL, cs.AI, and cs.NE

Abstract: We formulate sequence to sequence transduction as a noisy channel decoding problem and use recurrent neural networks to parameterise the source and channel models. Unlike direct models which can suffer from explaining-away effects during training, noisy channel models must produce outputs that explain their inputs, and their component models can be trained with not only paired training samples but also unpaired samples from the marginal output distribution. Using a latent variable to control how much of the conditioning sequence the channel model needs to read in order to generate a subsequent symbol, we obtain a tractable and effective beam search decoder. Experimental results on abstractive sentence summarisation, morphological inflection, and machine translation show that noisy channel models outperform direct models, and that they significantly benefit from increased amounts of unpaired output data that direct models cannot easily use.
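The core idea is that instead of scoring a candidate output y directly with p(y|x), the decoder scores it with the noisy channel objective log p(x|y) + log p(y), so the candidate must both explain the input and be plausible under an output-side language model (which can be trained on unpaired data). The following minimal sketch illustrates that scoring rule; the function names and toy probability models are hypothetical stand-ins, not the paper's RNN parameterisations:

```python
import math
from typing import Callable, List

def noisy_channel_score(
    x: List[str],
    y: List[str],
    channel_logp: Callable[[List[str], List[str]], float],
    lm_logp: Callable[[List[str]], float],
    lam: float = 1.0,
) -> float:
    """Score candidate output y for input x as log p(x|y) + lam * log p(y).

    Unlike a direct model p(y|x), the channel term forces y to explain
    the observed input x, mitigating explaining-away effects.
    """
    return channel_logp(x, y) + lam * lm_logp(y)

# Toy stand-ins for illustration only; the paper uses RNN sequence models.
def toy_channel_logp(x: List[str], y: List[str]) -> float:
    # Crude proxy for p(x|y): reward token overlap with the input.
    overlap = len(set(x) & set(y))
    return math.log((overlap + 1) / (len(x) + 1))

def toy_lm_logp(y: List[str]) -> float:
    # Toy language model prior favouring shorter outputs.
    return -0.5 * len(y)

if __name__ == "__main__":
    x = "the cat sat on the mat".split()
    candidates = ["cat sat on mat".split(), "a dog ran".split()]
    best = max(
        candidates,
        key=lambda y: noisy_channel_score(x, y, toy_channel_logp, toy_lm_logp),
    )
    print(" ".join(best))  # -> "cat sat on mat"
```

In the actual model, beam search interleaves this scoring with a latent variable that controls how much of the conditioning sequence the channel model reads before emitting each symbol, which is what makes decoding tractable.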

Authors (5)
  1. Lei Yu (234 papers)
  2. Phil Blunsom (87 papers)
  3. Chris Dyer (91 papers)
  4. Edward Grefenstette (66 papers)
  5. Tomas Kocisky (20 papers)
Citations (67)
