Orthogonality Constrained Multi-Head Attention For Keyword Spotting (1910.04500v1)

Published 10 Oct 2019 in cs.LG, eess.AS, and stat.ML

Abstract: The multi-head attention mechanism can learn various representations from sequential data while attending to different subsequences, e.g., word-pieces or syllables in a spoken word. From these subsequences it retrieves richer information than single-head attention, which only summarizes the whole sequence into one context vector. However, naive use of multi-head attention does not guarantee such richness, as the attention heads may have positional and representational redundancy. In this paper, we propose a regularization technique for the multi-head attention mechanism in an end-to-end neural keyword spotting system. Adding regularization terms that penalize positional and contextual non-orthogonality between the attention heads encourages them to output different representations from separate subsequences, which in turn enables leveraging structured information without explicit sequence models such as hidden Markov models. In addition, an intra-head contextual non-orthogonality regularization encourages each attention head to have similar representations across keyword examples, which helps classification by reducing feature variability. Experimental results demonstrate that the proposed regularization technique significantly improves keyword spotting performance for the keyword "Hey Snapdragon".
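
The following is a minimal sketch (not the authors' released code) of how such orthogonality penalties could be written on top of a standard multi-head attentive pooling layer. The function names, tensor shapes, and the way the intra-head term uses keyword labels are illustrative assumptions, not details confirmed by the paper.

```python
import torch
import torch.nn.functional as F

def inter_head_orthogonality_penalty(x: torch.Tensor) -> torch.Tensor:
    """Penalize non-orthogonality between attention heads.

    x: (batch, heads, dim) -- either the per-head attention-weight vectors
       over time ("positional") or the per-head context vectors ("contextual").
    Returns the mean squared off-diagonal entry of the head-by-head Gram matrix,
    so the loss is zero when the heads are mutually orthogonal.
    """
    x = F.normalize(x, dim=-1)                     # unit-normalize each head
    gram = torch.matmul(x, x.transpose(1, 2))      # (batch, heads, heads)
    eye = torch.eye(gram.size(-1), device=x.device)
    off_diag = gram * (1.0 - eye)                  # drop self-similarity terms
    return off_diag.pow(2).mean()

def intra_head_similarity_penalty(contexts: torch.Tensor,
                                  labels: torch.Tensor) -> torch.Tensor:
    """Encourage each head to produce similar context vectors across examples
    of the same keyword (reducing feature variability), per the abstract.

    contexts: (batch, heads, dim) per-head context vectors.
    labels:   (batch,) keyword labels; only same-label pairs are compared.
    """
    c = F.normalize(contexts, dim=-1)
    # cosine similarity between every pair of examples, separately per head
    sim = torch.einsum('bhd,khd->bkh', c, c)       # (batch, batch, heads)
    same = (labels[:, None] == labels[None, :]).float()
    same = same - torch.eye(len(labels), device=labels.device)  # exclude self-pairs
    denom = same.sum().clamp(min=1.0) * c.size(1)
    # maximize same-keyword similarity by minimizing (1 - cosine similarity)
    return ((1.0 - sim) * same[..., None]).sum() / denom
```

In training, these terms would be added to the keyword-classification loss with small weights, e.g. `loss = ce + a * inter_head_orthogonality_penalty(attn_weights) + b * inter_head_orthogonality_penalty(contexts) + c * intra_head_similarity_penalty(contexts, labels)`; the weights here are placeholders, not values from the paper.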

Authors (6)
  1. Mingu Lee (16 papers)
  2. Jinkyu Lee (14 papers)
  3. Hye Jin Jang (4 papers)
  4. Byeonggeun Kim (13 papers)
  5. Wonil Chang (3 papers)
  6. Kyuwoong Hwang (9 papers)
Citations (11)
