Inference Strategies for Machine Translation with Conditional Masking (2010.02352v2)

Published 5 Oct 2020 in cs.CL

Abstract: Conditional masked language model (CMLM) training has proven successful for non-autoregressive and semi-autoregressive sequence generation tasks, such as machine translation. Given a trained CMLM, however, it is not clear what the best inference strategy is. We formulate masked inference as a factorization of conditional probabilities of partial sequences, show that this does not harm performance, and investigate a number of simple heuristics motivated by this perspective. We identify a thresholding strategy that has advantages over the standard "mask-predict" algorithm, and provide analyses of its behavior on machine translation tasks.
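To make the setting concrete, the sketch below shows the general shape of iterative CMLM decoding with a confidence threshold: start from a fully masked target, fill every masked position each round, and re-mask positions whose predicted probability falls below a threshold rather than re-masking a fixed fraction as standard mask-predict does. This is a minimal illustration under assumed details; the function names (`mask_predict_threshold`, `predict_fn`, `toy_predict`), the mask sentinel, and the exact re-masking rule are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

def mask_predict_threshold(predict_fn, length, threshold=0.5, max_iters=10, mask_id=-1):
    """Toy threshold-based iterative CMLM decoding.

    predict_fn(tokens) -> (length, vocab_size) array of per-position
    probability distributions; mask_id marks still-masked positions.
    """
    tokens = np.full(length, mask_id, dtype=int)          # start fully masked
    for it in range(max_iters):
        probs = predict_fn(tokens)                        # distributions for every position
        best = probs.argmax(axis=-1)
        masked = tokens == mask_id
        tokens = np.where(masked, best, tokens)           # commit predictions for masked slots
        conf = probs[np.arange(length), tokens]           # probability of each committed token
        if it == max_iters - 1:
            break                                         # final round: keep everything
        remask = conf < threshold                         # threshold rule instead of fixed fraction
        if not remask.any():
            break                                         # all positions confident enough
        tokens = np.where(remask, mask_id, tokens)
    return tokens

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    vocab_size = 50

    def toy_predict(tokens):
        # Stand-in for a trained CMLM: random but peaked distributions.
        logits = rng.normal(size=(len(tokens), vocab_size)) * 5.0
        e = np.exp(logits - logits.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    print(mask_predict_threshold(toy_predict, length=8))
```

In a real system, `predict_fn` would be a forward pass through the trained CMLM conditioned on the source sentence; the threshold trades decoding speed against how aggressively low-confidence tokens are revisited.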

Authors (3)
  1. Julia Kreutzer (44 papers)
  2. George Foster (24 papers)
  3. Colin Cherry (38 papers)
Citations (5)