A Review of Bangla Natural Language Processing Tasks and the Utility of Transformer Models (2107.03844v3)

Published 8 Jul 2021 in cs.CL, cs.AI, cs.IR, and cs.LG

Abstract: Bangla -- ranked as the 6th most widely spoken language in the world (https://www.ethnologue.com/guides/ethnologue200), with 230 million native speakers -- is still considered a low-resource language in the NLP community. Despite three decades of research, Bangla NLP (BNLP) still lags behind, mainly due to the scarcity of resources and the challenges that come with it. There is sparse work in different areas of BNLP; however, a thorough survey reporting previous work and recent advances is yet to be done. In this study, we first provide a review of Bangla NLP tasks, resources, and tools available to the research community; we benchmark datasets collected from various platforms for nine NLP tasks using current state-of-the-art algorithms (i.e., transformer-based models). We provide comparative results for the studied NLP tasks by comparing monolingual vs. multilingual models of varying sizes. We report our results using both individual and consolidated datasets and provide data splits for future research. We reviewed a total of 108 papers and conducted 175 sets of experiments. Our results show promising performance using transformer-based models while highlighting the trade-off with computational costs. We hope that such a comprehensive survey will motivate the community to build on and further advance the research on Bangla NLP.
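To make the benchmarking setup concrete, the following is a minimal sketch of fine-tuning a multilingual transformer on a Bangla text classification dataset, in the spirit of the paper's monolingual vs. multilingual comparison. The model name, data files, column names, and label count are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch: fine-tune a multilingual transformer on a Bangla
# classification corpus. Assumes CSV files with "text" and "label" columns.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)
from datasets import load_dataset

model_name = "bert-base-multilingual-cased"  # assumed multilingual baseline
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)

# Placeholder dataset paths; any Bangla task with text/label columns works here.
dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="bangla-clf",
    per_device_train_batch_size=16,
    num_train_epochs=3,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
)
trainer.train()
```

Swapping `model_name` for a Bangla monolingual checkpoint would reproduce one axis of the paper's monolingual vs. multilingual comparison, at the cost of the larger models' higher compute noted in the abstract.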

Authors (7)
  1. Firoj Alam (75 papers)
  2. Arid Hasan (1 paper)
  3. Tanvirul Alam (5 papers)
  4. Akib Khan (3 papers)
  5. Janntatul Tajrin (1 paper)
  6. Naira Khan (1 paper)
  7. Shammur Absar Chowdhury (31 papers)
Citations (22)
