Attentive fine-tuning of Transformers for Translation of low-resourced languages @LoResMT 2021 (2108.08556v2)

Published 19 Aug 2021 in cs.CL

Abstract: This paper reports the Machine Translation (MT) systems submitted by the IIITT team for the English->Marathi and English->Irish language pairs in the LoResMT 2021 shared task. The task focuses on producing high-quality translations for low-resourced languages such as Irish and Marathi. For English->Marathi, we fine-tune IndicTrans, a pretrained multilingual NMT model, using an external parallel corpus as additional training data. For the latter language pair, we use a pretrained Helsinki-NLP Opus MT English->Irish model. Our approaches yield promising results on the BLEU metric. Under the team name IIITT, our systems ranked 1, 1, and 2 in English->Marathi, Irish->English, and English->Irish, respectively.
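
The fine-tuning recipe the abstract describes maps naturally onto the Hugging Face transformers stack. Below is a minimal sketch of continuing training of the pretrained Helsinki-NLP Opus MT English->Irish model on a parallel corpus; the corpus file names (train.en, train.ga) and all hyperparameters are illustrative assumptions, not the settings reported in the paper.

```python
# Minimal sketch: fine-tuning a pretrained Opus MT (Marian) model on a
# parallel corpus, in the spirit of the paper's English->Irish system.
# File paths and hyperparameters are placeholders, not the shared-task setup.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)
from datasets import Dataset

model_name = "Helsinki-NLP/opus-mt-en-ga"  # pretrained English->Irish model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Hypothetical parallel corpus: one English / one Irish sentence per line.
with open("train.en") as f_src, open("train.ga") as f_tgt:
    pairs = {"en": [l.strip() for l in f_src], "ga": [l.strip() for l in f_tgt]}
dataset = Dataset.from_dict(pairs)

def preprocess(batch):
    # Tokenize source text and target text; labels come from the target side.
    model_inputs = tokenizer(batch["en"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["ga"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True, remove_columns=["en", "ga"])

args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-en-ga-finetuned",
    per_device_train_batch_size=16,
    learning_rate=2e-5,   # illustrative; the paper does not report this value
    num_train_epochs=3,
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

Note that this covers only the Opus MT side; the paper's English->Marathi system fine-tunes IndicTrans, which ships with its own training toolkit rather than this interface.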

Authors (7)
  1. Karthik Puranik
  2. Adeep Hande
  3. Ruba Priyadharshini
  4. Thenmozhi Durairaj
  5. Anbukkarasi Sampath
  6. Kingston Pal Thamburaj
  7. Bharathi Raja Chakravarthi