
Unsupervised Multilingual Alignment using Wasserstein Barycenter (2002.00743v2)

Published 28 Jan 2020 in cs.CL, cs.LG, and stat.ML

Abstract: We study unsupervised multilingual alignment, the problem of finding word-to-word translations between multiple languages without using any parallel data. One popular strategy is to reduce multilingual alignment to the much simplified bilingual setting, by picking one of the input languages as the pivot language that we transit through. However, it is well-known that transiting through a poorly chosen pivot language (such as English) may severely degrade the translation quality, since the assumed transitive relations among all pairs of languages may not be enforced in the training process. Instead of going through a rather arbitrarily chosen pivot language, we propose to use the Wasserstein barycenter as a more informative "mean" language: it encapsulates information from all languages and minimizes all pairwise transportation costs. We evaluate our method on standard benchmarks and demonstrate state-of-the-art performances.
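The barycenter idea from the abstract can be illustrated on a toy problem. The sketch below computes an entropy-regularized Wasserstein barycenter of discrete distributions on a shared support via iterative Bregman projections (the standard Sinkhorn-style scheme); it is a minimal illustration of the "mean distribution" concept, not the paper's actual pipeline, which operates on word-embedding spaces across languages. All function and variable names here are hypothetical.

```python
import numpy as np

def sinkhorn_barycenter(dists, C, weights, eps=0.05, n_iter=300):
    """Entropy-regularized Wasserstein barycenter on a fixed support.

    dists:   (k, n) array, each row a probability vector on the support
    C:       (n, n) symmetric cost matrix between support points
    weights: (k,) barycentric weights summing to 1
    """
    K = np.exp(-C / eps)        # Gibbs kernel from the cost matrix
    u = np.ones_like(dists)     # one scaling vector per input distribution
    for _ in range(n_iter):
        v = dists / (u @ K)             # enforce marginal p_k (K symmetric)
        Kv = v @ K
        b = np.exp(weights @ np.log(Kv))  # weighted geometric mean = barycenter
        u = b / Kv                        # enforce shared barycenter marginal
    return b

# Toy example: two peaked histograms on [0, 1]; their barycenter
# sits between them, minimizing total transportation cost to both.
x = np.linspace(0, 1, 50)
C = (x[:, None] - x[None, :]) ** 2
p1 = np.exp(-(x - 0.2) ** 2 / 0.01); p1 /= p1.sum()
p2 = np.exp(-(x - 0.8) ** 2 / 0.01); p2 /= p2.sum()
b = sinkhorn_barycenter(np.stack([p1, p2]), C, np.array([0.5, 0.5]))
```

With equal weights the resulting barycenter's mass centers near 0.5, between the two inputs, which is the sense in which the barycenter acts as an informative "mean" that no single pivot language provides.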

Authors (5)
  1. Xin Lian (5 papers)
  2. Kshitij Jain (11 papers)
  3. Jakub Truszkowski (8 papers)
  4. Pascal Poupart (80 papers)
  5. Yaoliang Yu (59 papers)
Citations (25)
