Multilingual Multi-Domain Adaptation Approaches for Neural Machine Translation

Published 19 Jun 2019 in cs.CL (arXiv:1906.07978v2)

Abstract: In this paper, we propose two novel methods for domain adaptation for the attention-only neural machine translation (NMT) model, i.e., the Transformer. Our methods focus on training a single translation model for multiple domains by learning either domain-specialized hidden state representations or predictor biases for each domain. We combine our methods with a previously proposed black-box method called mixed fine tuning, which is known to be highly effective for domain adaptation. In addition, we incorporate multilingualism into the domain adaptation framework. Experiments show that multilingual multi-domain adaptation can significantly improve both resource-poor in-domain and resource-rich out-of-domain translations, and that combining our methods with mixed fine tuning achieves the best performance.
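To make the second idea concrete, below is a minimal PyTorch sketch of a per-domain predictor bias: the decoder's output projection gets an extra learned bias vector, selected by a domain ID and added to the vocabulary logits. This is an illustrative reading of the abstract under assumed names and shapes (DomainBiasedGenerator, d_model, etc.), not the authors' implementation.

```python
# Sketch (assumption, not the paper's code): a Transformer decoder output
# projection with one learned bias vector per domain added to the logits.
import torch
import torch.nn as nn

class DomainBiasedGenerator(nn.Module):
    """Shared output projection plus a per-domain bias over the vocabulary."""
    def __init__(self, d_model: int, vocab_size: int, num_domains: int):
        super().__init__()
        self.proj = nn.Linear(d_model, vocab_size)
        # One bias vector of size vocab_size per domain, zero-initialized so
        # training starts from the shared, domain-agnostic predictor.
        self.domain_bias = nn.Embedding(num_domains, vocab_size)
        nn.init.zeros_(self.domain_bias.weight)

    def forward(self, hidden: torch.Tensor, domain_id: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, d_model); domain_id: (batch,)
        logits = self.proj(hidden)                       # shared projection
        bias = self.domain_bias(domain_id).unsqueeze(1)  # (batch, 1, vocab)
        return logits + bias                             # domain-shifted logits

# Usage: route each training batch through the bias of its own domain.
gen = DomainBiasedGenerator(d_model=512, vocab_size=32000, num_domains=3)
hidden = torch.randn(2, 7, 512)            # stand-in decoder states
domain_id = torch.tensor([0, 2])           # hypothetical domain labels
logits = gen(hidden, domain_id)            # (2, 7, 32000)
```

The appeal of this design is that a single shared model serves all domains, with only a small vocabulary-sized bias vector learned per domain; the domain-specialized hidden-representation method described alongside it would instead condition the decoder states themselves.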

Citations (22)

Authors (2)
