Toward a full-scale neural machine translation in production: the Booking.com use case (1709.05820v2)

Published 18 Sep 2017 in cs.CL

Abstract: While some remarkable progress has been made in neural machine translation (NMT) research, there have not been many reports on its development and evaluation in practice. This paper tries to fill this gap by presenting some of our findings from building an in-house travel domain NMT system in a large scale E-commerce setting. The three major topics that we cover are optimization and training (including different optimization strategies and corpus sizes), handling real-world content and evaluating results.

Authors (5)
  1. Pavel Levin (4 papers)
  2. Nishikant Dhanuka (2 papers)
  3. Talaat Khalil (2 papers)
  4. Fedor Kovalev (3 papers)
  5. Maxim Khalilov (2 papers)
Citations (17)