
End-to-End Language Identification using Multi-Head Self-Attention and 1D Convolutional Neural Networks (2102.00306v1)

Published 30 Jan 2021 in eess.AS

Abstract: In this work, we propose a new approach to language identification for Indian languages using multi-head self-attention combined with raw-waveform-based 1D convolutional neural networks. Our approach uses an encoder, multi-head self-attention, and a statistics pooling layer. The encoder learns features directly from raw waveforms using 1D convolution kernels and an LSTM layer. The LSTM layer captures temporal information across the features extracted by the 1D convolutional layer. The multi-head self-attention layer takes the LSTM outputs and applies self-attention to these features with M different heads, letting the model assign more weight to useful features and less weight to less relevant ones. Finally, the frame-level features are combined by a statistics pooling layer to extract an utterance-level feature vector for label prediction. We conduct all our experiments on 373 hours of audio data spanning eight Indian languages. Our experiments show that our approach outperforms the baseline model by an absolute 3.69% in F1-score, achieving a best F1-score of 95.90%. We also show that raw-waveform models yield a 1.7% performance improvement over models built on handcrafted features.
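
The abstract describes a pipeline of a 1D-CNN encoder over raw waveforms, an LSTM, multi-head self-attention, statistics pooling, and a classifier. The sketch below shows one plausible PyTorch realization of that pipeline, assuming illustrative hyperparameters (kernel sizes, strides, hidden size 256, 4 attention heads, mean-plus-standard-deviation pooling) that are not taken from the paper:

```python
import torch
import torch.nn as nn

class RawWaveformLID(nn.Module):
    """Sketch of the described pipeline: 1D-CNN encoder on raw waveforms,
    an LSTM for temporal context, multi-head self-attention over the LSTM
    outputs, statistics pooling, and a language classifier."""

    def __init__(self, num_languages=8, hidden=256, heads=4):
        super().__init__()
        # 1D convolutions learn filterbank-like features from raw audio.
        # Kernel sizes and strides here are assumptions for illustration.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=400, stride=160), nn.ReLU(),
            nn.Conv1d(64, hidden, kernel_size=5, stride=2), nn.ReLU(),
        )
        # LSTM captures temporal dependencies between the CNN features.
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        # Multi-head self-attention reweights the frame-level features.
        self.attn = nn.MultiheadAttention(hidden, num_heads=heads,
                                          batch_first=True)
        # Statistics pooling (mean + std) yields a fixed-size
        # utterance-level vector fed to the language classifier.
        self.classifier = nn.Linear(2 * hidden, num_languages)

    def forward(self, wave):                  # wave: (batch, samples)
        x = self.encoder(wave.unsqueeze(1))   # (batch, hidden, frames)
        x = x.transpose(1, 2)                 # (batch, frames, hidden)
        x, _ = self.lstm(x)
        x, _ = self.attn(x, x, x)             # self-attention: Q = K = V
        stats = torch.cat([x.mean(dim=1), x.std(dim=1)], dim=-1)
        return self.classifier(stats)         # per-language logits

# Two 1-second clips at 16 kHz -> logits over 8 Indian languages.
logits = RawWaveformLID()(torch.randn(2, 16000))
print(logits.shape)  # torch.Size([2, 8])
```

Training details (loss, optimizer, data pipeline) are omitted; a standard cross-entropy objective over the eight language labels would be the conventional choice for this kind of classifier.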

Citations (2)
