
Improving Multilingual Sentence Embedding using Bi-directional Dual Encoder with Additive Margin Softmax (1902.08564v2)

Published 22 Feb 2019 in cs.CL

Abstract: In this paper, we present an approach to learn multilingual sentence embeddings using a bi-directional dual-encoder with additive margin softmax. The embeddings are able to achieve state-of-the-art results on the United Nations (UN) parallel corpus retrieval task. In all the languages tested, the system achieves P@1 of 86% or higher. We use pairs retrieved by our approach to train NMT models that achieve similar performance to models trained on gold pairs. We explore simple document-level embeddings constructed by averaging our sentence embeddings. On the UN document-level retrieval task, document embeddings achieve around 97% on P@1 for all experimented language pairs. Lastly, we evaluate the proposed model on the BUCC mining task. The learned embeddings with raw cosine similarity scores achieve competitive results compared to current state-of-the-art models, and with a second-stage scorer we achieve a new state-of-the-art level on this task.
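
The core training objective described in the abstract is a translation-ranking loss over a dual encoder, where an additive margin is subtracted from the similarity score of each true translation pair and the loss is applied in both source-to-target and target-to-source directions. Below is a minimal PyTorch sketch of that idea, not the authors' code: the function name, the use of in-batch negatives, the assumption of L2-normalized embeddings, and the margin value are all illustrative.

```python
import torch
import torch.nn.functional as F

def bidirectional_margin_loss(src_emb: torch.Tensor,
                              tgt_emb: torch.Tensor,
                              margin: float = 0.3) -> torch.Tensor:
    """Additive margin softmax over in-batch negatives, in both directions.

    src_emb, tgt_emb: (N, d) L2-normalized embeddings of N parallel
    sentence pairs; the other N-1 sentences in the batch act as negatives.
    """
    sim = src_emb @ tgt_emb.T                                # (N, N) cosine similarities
    # Subtract the margin only on the diagonal, i.e. from the true pairs,
    # so positives must beat negatives by at least `margin`.
    sim = sim - margin * torch.eye(sim.size(0), device=sim.device)
    labels = torch.arange(sim.size(0), device=sim.device)
    # The diagonal is the positive class in both directions; the
    # target-to-source term reuses the transposed score matrix.
    return F.cross_entropy(sim, labels) + F.cross_entropy(sim.T, labels)
```

For the document-level retrieval experiments, the abstract states that document embeddings are simply the average of the sentence embeddings, e.g. `doc_emb = sent_embs.mean(dim=0)`; whether the result is re-normalized before cosine scoring is an assumption, not stated in the abstract.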

Authors (9)
  1. Yinfei Yang (73 papers)
  2. Gustavo Hernandez Abrego (12 papers)
  3. Steve Yuan (5 papers)
  4. Mandy Guo (21 papers)
  5. Qinlan Shen (6 papers)
  6. Daniel Cer (28 papers)
  7. Brian Strope (11 papers)
  8. Ray Kurzweil (11 papers)
  9. Yun-Hsuan Sung (18 papers)
Citations (112)
