
Low-resource keyword spotting using contrastively trained transformer acoustic word embeddings (2506.17690v1)

Published 21 Jun 2025 in eess.AS

Abstract: We introduce a new approach, the ContrastiveTransformer, that produces acoustic word embeddings (AWEs) for the purpose of very low-resource keyword spotting. The ContrastiveTransformer, an encoder-only model, directly optimises the embedding space using normalised temperature-scaled cross entropy (NT-Xent) loss. We use this model to perform keyword spotting for radio broadcasts in Luganda and Bambara, the latter a severely under-resourced language. We compare our model to various existing AWE approaches, including those constructed from large pre-trained self-supervised models, a recurrent encoder which previously used the NT-Xent loss, and a DTW baseline. We demonstrate that the proposed contrastive transformer approach offers performance improvements over all considered existing approaches to very low-resource keyword spotting in both languages.
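The abstract names the NT-Xent (normalised temperature-scaled cross entropy) objective as the loss used to shape the embedding space. The paper's own implementation is not shown here, but as a rough illustration of the general NT-Xent technique, below is a minimal PyTorch sketch. All names (`nt_xent_loss`, `z1`, `z2`, `temperature`) are hypothetical, and treating rows of `z1` and `z2` as paired spoken instances of the same word is an assumption about how positives would be formed for AWEs, not a claim about the authors' exact setup.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """Generic NT-Xent loss sketch (assumed setup, not the paper's code).

    z1, z2: (N, D) embeddings of two instances of the same N word types,
    e.g. two spoken examples of each keyword. Row i of z1 and row i of z2
    form a positive pair; every other row in the batch acts as a negative.
    """
    n = z1.size(0)
    # L2-normalise so the dot product below is cosine similarity.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)        # (2N, D)
    sim = z @ z.t() / temperature                             # (2N, 2N) scaled similarities
    sim.fill_diagonal_(float("-inf"))                         # exclude self-similarity
    # The positive for row i is row (i + N) mod 2N.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)])
    # Softmax cross entropy over each row implements the NT-Xent objective.
    return F.cross_entropy(sim, targets)
```

With unit-normalised embeddings, minimising this loss pulls paired instances of the same word together while pushing apart embeddings of different words in the batch, which is the property that makes the resulting AWEs usable for keyword spotting by nearest-neighbour comparison.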
