Iterative Batch Back-Translation for Neural Machine Translation: A Conceptual Model (2001.11327v2)
Published 26 Nov 2019 in cs.CL
Abstract: An effective method for generating a large number of parallel sentences to train improved neural machine translation (NMT) systems is back-translation of target-side monolingual data. Recently, iterative back-translation has been shown to outperform standard back-translation, albeit only on some language pairs. This work proposes iterative batch back-translation, an approach aimed at enhancing standard iterative back-translation and enabling more efficient use of monolingual data. After each iteration, improved back-translations of new sentences are added to the parallel data that will be used to train the final forward model. The work presents a conceptual model of the proposed approach.
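The loop described in the abstract can be summarized in a short sketch. This is a minimal, hedged illustration of the general idea, not the paper's released code: `train` and the batch/pair variables below are hypothetical placeholders for a real NMT training and inference pipeline.

```python
from typing import Callable, List, Tuple

Pair = Tuple[str, str]  # (source sentence, target sentence)

def iterative_batch_back_translation(
    parallel: List[Pair],
    mono_target_batches: List[List[str]],
    train: Callable[[List[Pair], str], Callable[[str], str]],
    n_iterations: int,
) -> Callable[[str], str]:
    """Each iteration back-translates a fresh batch of target-side
    monolingual sentences with the current backward model; the growing
    synthetic corpus then trains the final forward model."""
    synthetic: List[Pair] = []
    for i in range(n_iterations):
        # Retrain the backward (target -> source) model on all pairs so far,
        # so each new batch receives improved back-translations.
        backward = train(parallel + synthetic, "tgt->src")
        batch = mono_target_batches[i % len(mono_target_batches)]
        synthetic += [(backward(t), t) for t in batch]
    # The final forward (source -> target) model is trained on the
    # authentic parallel data plus all accumulated synthetic pairs.
    return train(parallel + synthetic, "src->tgt")
```

The key design choice, under these assumptions, is that each iteration consumes a *new* batch of monolingual sentences rather than re-translating the same pool, so later batches benefit from progressively better backward models while the synthetic corpus keeps growing.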
- Idris Abdulmumin
- Bashir Shehu Galadanci
- Abubakar Isa