
Decoding High-level Imagined Speech using Attention-based Deep Neural Networks (2112.06922v1)

Published 13 Dec 2021 in cs.HC, cs.SD, and eess.AS

Abstract: A brain-computer interface (BCI) is a technology that enables communication between humans and devices by reflecting the status and intentions of humans. When conducting imagined speech, users imagine the pronunciation as if they were actually speaking. When decoding imagined speech-based EEG signals, complex tasks can be conducted more intuitively, but decoding performance is lower than that of other BCI paradigms. We modified our previous model for decoding imagined speech-based EEG signals. Ten subjects participated in the experiment. The average accuracy of our proposed method was 0.5648 for classifying four words, indicating that the method has notable strength in learning local features. Hence, we demonstrated the feasibility of decoding imagined speech-based EEG signals with robust performance.
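The page does not include the paper's architecture details, so the following is only a minimal PyTorch sketch of the general idea described in the abstract: a temporal convolution for local feature learning followed by self-attention over time, classifying four imagined-speech words. All names (e.g., `EEGAttentionClassifier`), the channel count, sampling rate, and hyperparameters are assumptions for illustration, not the authors' model.

```python
import torch
import torch.nn as nn

class EEGAttentionClassifier(nn.Module):
    """Hypothetical attention-based classifier for imagined-speech EEG.

    Assumes input of shape (batch, channels, time); the paper's actual
    architecture and hyperparameters are not specified on this page.
    """
    def __init__(self, n_channels=64, n_classes=4, d_model=64, n_heads=4):
        super().__init__()
        # Temporal convolution to extract local features from the EEG signal.
        self.local_features = nn.Sequential(
            nn.Conv1d(n_channels, d_model, kernel_size=25, padding=12),
            nn.BatchNorm1d(d_model),
            nn.ELU(),
            nn.AvgPool1d(kernel_size=4),
        )
        # Self-attention over the downsampled time steps.
        self.attention = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, channels, time)
        h = self.local_features(x)       # (batch, d_model, time')
        h = h.transpose(1, 2)            # (batch, time', d_model)
        h, _ = self.attention(h, h, h)   # self-attention across time steps
        h = h.mean(dim=1)                # average-pool over time
        return self.classifier(h)        # (batch, n_classes) logits

# Example usage: a batch of 8 trials, 64 channels, 2-second epochs at an assumed 250 Hz.
model = EEGAttentionClassifier()
logits = model(torch.randn(8, 64, 500))
print(logits.shape)  # torch.Size([8, 4])
```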

Authors (3)
  1. Dae-Hyeok Lee (16 papers)
  2. Sung-Jin Kim (20 papers)
  3. Keon-Woo Lee (2 papers)
Citations (4)
