
Aligning Very Small Parallel Corpora Using Cross-Lingual Word Embeddings and a Monogamy Objective (1811.00066v1)

Published 31 Oct 2018 in cs.CL

Abstract: Count-based word alignment methods, such as the IBM models or fast-align, struggle on very small parallel corpora. We therefore present an alternative approach based on cross-lingual word embeddings (CLWEs), which are trained on purely monolingual data. Our main contribution is an unsupervised objective to adapt CLWEs to parallel corpora. In experiments on between 25 and 500 sentences, our method outperforms fast-align. We also show that our fine-tuning objective consistently improves a CLWE-only baseline.
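The CLWE-only baseline the abstract mentions can be sketched as nearest-neighbor matching in the shared embedding space: each source token is linked to the target token whose embedding has the highest cosine similarity. This is a hypothetical illustration, not the authors' implementation, and it omits the paper's unsupervised fine-tuning (monogamy) objective.

```python
import numpy as np

def clwe_align(src_vecs, tgt_vecs):
    """Align each source token to its most similar target token
    by cosine similarity in a shared cross-lingual embedding space."""
    src = src_vecs / np.linalg.norm(src_vecs, axis=1, keepdims=True)
    tgt = tgt_vecs / np.linalg.norm(tgt_vecs, axis=1, keepdims=True)
    sim = src @ tgt.T          # (len_src, len_tgt) cosine similarities
    return sim.argmax(axis=1)  # index of best target token per source token

# Toy embeddings (hypothetical 4-dim vectors for a 3-token sentence pair).
tgt = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 0.0]])
src = np.array([[0.0, 0.1, 0.9, 0.0],   # closest to tgt[2]
                [0.9, 0.0, 0.1, 0.0],   # closest to tgt[0]
                [0.1, 0.9, 0.0, 0.0]])  # closest to tgt[1]
print(clwe_align(src, tgt))  # → [2 0 1]
```

Because this baseline scores each source token independently, several source tokens can collapse onto one target token; the paper's fine-tuning objective is designed to discourage such many-to-one links.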

Authors (4)
  1. Nina Poerner (9 papers)
  2. Masoud Jalili Sabet (12 papers)
  3. Benjamin Roth (48 papers)
  4. Hinrich Schütze (250 papers)
Citations (2)
