
Machine Translation with Large Language Models: Decoder Only vs. Encoder-Decoder (2409.13747v1)

Published 12 Sep 2024 in cs.CL, cs.ET, and cs.LG

Abstract: This project, titled "Machine Translation with LLMs: Decoder-only vs. Encoder-Decoder," aims to develop a multilingual machine translation (MT) model focused on Indian regional languages, especially Telugu, Tamil, and Malayalam, enabling accurate and contextually appropriate translations across diverse language pairs. The primary objective is to deliver high-quality translations by leveraging LLMs, specifically comparing the effectiveness of Decoder-only and Encoder-Decoder architectures to optimize translation quality and efficiency across multilingual contexts. Through rigorous experimentation and analysis, the project aims to advance the field of machine translation, contributing insights into the effectiveness of different model architectures and paving the way for enhanced cross-linguistic communication tools.
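The architectural contrast the abstract draws can be made concrete through the attention patterns each design permits. As a minimal sketch (not from the paper itself): a decoder-only model treats the source and target as one sequence under a single causal mask, whereas an encoder-decoder model gives the encoder bidirectional self-attention over the source, a causal mask over the target, and full cross-attention from decoder to encoder. The function names and toy lengths below are illustrative assumptions.

```python
def decoder_only_mask(src_len, tgt_len):
    """Decoder-only: source and target concatenated into one sequence;
    position i may attend only to positions j <= i (causal mask)."""
    n = src_len + tgt_len
    return [[j <= i for j in range(n)] for i in range(n)]

def encoder_decoder_masks(src_len, tgt_len):
    """Encoder-decoder: bidirectional encoder self-attention over the
    source, causal decoder self-attention over the target, and full
    decoder-to-encoder cross-attention."""
    enc_self = [[True] * src_len for _ in range(src_len)]            # bidirectional
    dec_self = [[j <= i for j in range(tgt_len)] for i in range(tgt_len)]  # causal
    cross = [[True] * src_len for _ in range(tgt_len)]               # full cross-attn
    return enc_self, dec_self, cross

# With a 4-token source and 3-token target: in the decoder-only setting the
# first source token cannot see later source tokens, while the encoder in the
# encoder-decoder setting sees the whole source at once.
m = decoder_only_mask(4, 3)
enc, dec, cross = encoder_decoder_masks(4, 3)
```

This difference is one reason the two designs can trade off translation quality against efficiency: the encoder's bidirectional view of the source comes at the cost of a second stack of parameters and a cross-attention step per decoded token.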
