
Towards Robust k-Nearest-Neighbor Machine Translation (2210.08808v1)

Published 17 Oct 2022 in cs.CL

Abstract: k-Nearest-Neighbor Machine Translation (kNN-MT) has become an important research direction in NMT in recent years. Its main idea is to retrieve useful key-value pairs from an additional datastore to modify translations without updating the NMT model. However, noisy retrieved pairs can dramatically deteriorate model performance. In this paper, we conduct a preliminary study and find that this problem results from not fully exploiting the prediction of the NMT model. To alleviate the impact of noise, we propose a confidence-enhanced kNN-MT model with robust training. Concretely, we introduce NMT confidence to refine the modeling of two important components of kNN-MT: the kNN distribution and the interpolation weight. Meanwhile, we inject two types of perturbations into the retrieved pairs for robust training. Experimental results on four benchmark datasets demonstrate that our model not only achieves significant improvements over current kNN-MT models, but also exhibits better robustness. Our code is available at https://github.com/DeepLearnXMU/Robust-knn-mt.
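
The abstract builds on the standard kNN-MT recipe: at each decoding step, the NMT model's distribution over the vocabulary is interpolated with a distribution formed from the retrieved datastore neighbors. Below is a minimal PyTorch sketch of that baseline interpolation only, not the paper's confidence-enhanced variant; the function name `knn_mt_interpolate`, the fixed `temperature`, and the fixed interpolation weight `lam` are illustrative assumptions (the paper instead conditions the kNN distribution and the interpolation weight on the NMT model's confidence).

```python
import torch
import torch.nn.functional as F

def knn_mt_interpolate(nmt_logits, knn_distances, knn_token_ids,
                       temperature=10.0, lam=0.5):
    """Baseline kNN-MT interpolation (illustrative sketch).

    nmt_logits:    (vocab_size,) raw decoder logits for the current step
    knn_distances: (k,) distances of the k retrieved datastore neighbors
    knn_token_ids: (k,) LongTensor of target-token ids (the "values")
    """
    vocab_size = nmt_logits.size(-1)

    # NMT distribution over the vocabulary
    p_nmt = F.softmax(nmt_logits, dim=-1)

    # kNN distribution: softmax over negative distances, scattered onto the vocab
    knn_weights = F.softmax(-knn_distances / temperature, dim=-1)
    p_knn = torch.zeros(vocab_size)
    p_knn.scatter_add_(0, knn_token_ids, knn_weights)

    # Fixed-weight interpolation; the paper replaces the fixed lam (and the
    # fixed temperature) with values derived from NMT confidence to damp
    # the influence of noisy retrieved pairs.
    return lam * p_knn + (1.0 - lam) * p_nmt
```

A toy call such as `knn_mt_interpolate(torch.randn(32000), torch.rand(8) * 10, torch.randint(0, 32000, (8,)))` returns a (32000,)-dimensional probability vector for the next target token.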

Authors (7)
  1. Hui Jiang (99 papers)
  2. Ziyao Lu (5 papers)
  3. Fandong Meng (174 papers)
  4. Chulun Zhou (13 papers)
  5. Jie Zhou (687 papers)
  6. Degen Huang (8 papers)
  7. Jinsong Su (96 papers)
Citations (18)
