Ẹ KÚ <MASK>: Integrating Yorùbá cultural greetings into machine translation (2303.17972v2)
Abstract: This paper investigates the performance of massively multilingual neural machine translation (NMT) systems in translating Yorùbá greetings (Ẹ kú [MASK]), which are an integral part of Yorùbá language and culture, into English. To evaluate these models, we present IkiniYorùbá, a Yorùbá-English translation dataset containing Yorùbá greetings and sample use cases. We analysed the performance of different multilingual NMT systems, including Google Translate and NLLB, and show that these models struggle to accurately translate Yorùbá greetings into English. In addition, we trained a Yorùbá-English model by finetuning an existing NMT model on the training split of IkiniYorùbá; it achieved better performance than the pre-trained multilingual NMT models, even though they were trained on far larger volumes of data.
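The abstract names the finetuning approach but not the exact base checkpoint or hyperparameters. Below is a minimal sketch of how such a finetuning run might look with Hugging Face `transformers`, assuming `facebook/nllb-200-distilled-600M` as the base model; the example sentence pairs stand in for the IkiniYorùbá training split, and all hyperparameters are illustrative, not the authors' setup.

```python
# Sketch: finetune a multilingual NMT checkpoint on Yoruba-English
# greeting pairs. Checkpoint, data, and hyperparameters are assumptions.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "facebook/nllb-200-distilled-600M"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(
    checkpoint, src_lang="yor_Latn", tgt_lang="eng_Latn"
)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Toy parallel examples standing in for the IkiniYorùbá training split.
pairs = [
    {"yo": "Ẹ kú àárọ̀", "en": "Good morning"},
    {"yo": "Ẹ kú iṣẹ́", "en": "Well done with your work"},
]
dataset = Dataset.from_list(pairs)

def preprocess(batch):
    # Tokenize the Yoruba source and English target sides together.
    return tokenizer(
        batch["yo"], text_target=batch["en"],
        truncation=True, max_length=128,
    )

tokenized = dataset.map(preprocess, batched=True, remove_columns=["yo", "en"])

args = Seq2SeqTrainingArguments(
    output_dir="yo-en-greetings",
    per_device_train_batch_size=8,
    num_train_epochs=3,      # assumed; tune against a dev split
    learning_rate=5e-5,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

The key point the paper makes is that even a small, domain-specific training split like this can outperform much larger general-purpose multilingual models on culturally specific expressions.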
- Idris Akinade
- Jesujoba Alabi
- David Adelani
- Clement Odoje
- Dietrich Klakow