Unveiling the Power of Source: Source-based Minimum Bayes Risk Decoding for Neural Machine Translation (2406.11632v3)

Published 17 Jun 2024 in cs.CL and cs.AI

Abstract: Maximum a posteriori decoding, a commonly used method for neural machine translation (NMT), aims to maximize the estimated posterior probability. However, high estimated probability does not always lead to high translation quality. Minimum Bayes Risk (MBR) decoding (Kumar and Byrne, 2004) offers an alternative by seeking hypotheses with the highest expected utility. Inspired by Quality Estimation (QE) reranking, which uses a QE model as a ranker (Fernandes et al., 2022), we propose source-based MBR (sMBR) decoding, a novel approach that uses quasi-sources (generated via paraphrasing or back-translation) as "support hypotheses" and a reference-free quality estimation metric as the utility function, marking the first work to rely solely on sources in MBR decoding. Experiments show that sMBR outperforms QE reranking and standard MBR decoding. Our findings suggest that sMBR is a promising approach for NMT decoding.
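The abstract sketches the core selection step: each candidate translation is scored by its expected utility over a set of quasi-sources (paraphrases or back-translations of the input), with a reference-free QE metric as the utility function. Below is a minimal Python sketch of that step under stated assumptions: the `qe_score(source, hypothesis)` scorer and the overall function structure are illustrative placeholders, not the authors' implementation.

```python
from typing import Callable, List


def smbr_decode(
    hypotheses: List[str],
    quasi_sources: List[str],
    qe_score: Callable[[str, str], float],
) -> str:
    """Return the hypothesis with the highest expected utility, where the
    expectation is taken over quasi-sources (paraphrases / back-translations
    of the original input) and the utility is a reference-free QE score.

    NOTE: `qe_score` is an assumed interface (higher = better translation of
    `source`); the paper's actual metric and implementation may differ."""
    best_hyp, best_utility = None, float("-inf")
    for hyp in hypotheses:
        # Average QE score of this hypothesis across the quasi-source set
        # approximates its expected utility.
        utility = sum(qe_score(src, hyp) for src in quasi_sources) / len(quasi_sources)
        if utility > best_utility:
            best_hyp, best_utility = hyp, utility
    return best_hyp
```

In contrast to standard MBR decoding, which compares each hypothesis against other sampled hypotheses acting as pseudo-references, this sketch compares each hypothesis only against source-side variants, matching the source-only framing described in the abstract.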

Authors (4)
  1. Boxuan Lyu (1 paper)
  2. Hidetaka Kamigaito (62 papers)
  3. Kotaro Funakoshi (8 papers)
  4. Manabu Okumura (41 papers)
