Dynamic Fusion: Attentional Language Model for Neural Machine Translation (1909.04879v1)
Abstract: Neural Machine Translation (NMT) can generate fluent output. Accordingly, language models (LMs) have been investigated for incorporation into NMT. Prior work combines two models, a translation model and an LM, where the translation model's predictions are weighted by the LM with a ratio hand-crafted in advance. However, such approaches fail to adapt the LM weight to the translation history. In another line of work, the LM prediction is incorporated into the translation model by jointly considering source and target information; this approach is limited, however, because it largely ignores the adequacy of the translation output. Accordingly, this work employs two mechanisms, the translation model and the LM, with an attentional architecture that treats the LM as an auxiliary element of the translation model. Compared with previous work on English--Japanese machine translation using an LM, the proposed Dynamic Fusion mechanism improves BLEU and Rank-based Intuitive Bilingual Evaluation Score (RIBES) scores. Additionally, analyses of the attention weights and the predictivity of the LM show that Dynamic Fusion enables predictive language modeling that conforms to the appropriate grammatical structure.
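The contrast the abstract draws can be illustrated with a minimal sketch. The snippet below is not the paper's architecture; it is a hypothetical toy example contrasting fixed-ratio fusion (the hand-crafted weight criticized above) with a per-step gate computed from the decoder state, so the LM's influence can vary with the translation history. All function and variable names (`shallow_fusion`, `dynamic_fusion`, `w_gate`) are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a logit vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def shallow_fusion(tm_logits, lm_logits, lam=0.3):
    # Fixed interpolation weight lam, hand-crafted in advance:
    # the LM's influence is the same at every decoding step.
    return (1.0 - lam) * softmax(tm_logits) + lam * softmax(lm_logits)

def dynamic_fusion(tm_logits, lm_logits, decoder_state, w_gate):
    # Gate computed from the decoder state (i.e., the translation
    # history), so the LM weight adapts at each time step.
    g = 1.0 / (1.0 + np.exp(-decoder_state @ w_gate))  # sigmoid in (0, 1)
    return (1.0 - g) * softmax(tm_logits) + g * softmax(lm_logits)

# Toy usage: 3-word vocabulary, 2-dimensional decoder state.
tm_logits = np.array([2.0, 0.5, 0.1])
lm_logits = np.array([0.1, 1.5, 0.3])
decoder_state = np.array([0.2, -0.1])
w_gate = np.array([0.5, 1.0])  # would be learned in practice

p_fixed = shallow_fusion(tm_logits, lm_logits)
p_dynamic = dynamic_fusion(tm_logits, lm_logits, decoder_state, w_gate)
```

Both outputs are convex combinations of two distributions, so each sums to one; the difference is only whether the mixing weight is constant or state-dependent.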