
Towards Building ASR Systems for the Next Billion Users (2111.03945v3)

Published 6 Nov 2021 in cs.CL, cs.SD, and eess.AS

Abstract: Recent methods in speech and language technology pretrain very large models which are fine-tuned for specific tasks. However, the benefits of such large models are often limited to a few resource rich languages of the world. In this work, we make multiple contributions towards building ASR systems for low resource languages from the Indian subcontinent. First, we curate 17,000 hours of raw speech data for 40 Indian languages from a wide variety of domains including education, news, technology, and finance. Second, using this raw speech data we pretrain several variants of wav2vec style models for 40 Indian languages. Third, we analyze the pretrained models to find key features: codebook vectors of similar sounding phonemes are shared across languages, representations across layers are discriminative of the language family, and attention heads often pay attention within small local windows. Fourth, we fine-tune this model for downstream ASR for 9 languages and obtain state-of-the-art results on 3 public datasets, including on very low-resource languages such as Sinhala and Nepali. Our work establishes that multilingual pretraining is an effective strategy for building ASR systems for the linguistically diverse speakers of the Indian subcontinent. Our code, data and models are available publicly at https://indicnlp.ai4bharat.org/indicwav2vec/ and we hope they will help advance research in ASR for Indic languages.

Authors (8)
  1. Tahir Javed (9 papers)
  2. Sumanth Doddapaneni (16 papers)
  3. Abhigyan Raman (5 papers)
  4. Kaushal Santosh Bhogale (6 papers)
  5. Gowtham Ramesh (6 papers)
  6. Anoop Kunchukuttan (45 papers)
  7. Pratyush Kumar (44 papers)
  8. Mitesh M. Khapra (79 papers)
Citations (50)
