
Comparison of linear and nonlinear methods for decoding selective attention to speech from ear-EEG recordings (2401.05187v2)

Published 10 Jan 2024 in eess.AS

Abstract: Many people with hearing loss struggle to comprehend speech in crowded auditory scenes, even when they are using hearing aids. It has recently been demonstrated that the focus of a listener's selective attention to speech can be decoded from their electroencephalography (EEG) recordings, raising the prospect of smart EEG-steered hearing aids which restore speech comprehension in adverse acoustic environments (such as the cocktail party). To this end, we here assess the feasibility of using a novel, ultra-wearable ear-EEG device to classify the selective attention of normal-hearing listeners who participated in a two-talker competing-speakers experiment. Eighteen participants took part in a diotic listening task, whereby they were asked to attend to one narrator whilst ignoring the other. Encoding models were estimated from the recorded signals, and these confirmed that the device has the ability to capture auditory responses that are consistent with those reported in high-density EEG studies. Several state-of-the-art auditory attention decoding algorithms were next compared, including stimulus-reconstruction algorithms based on linear regression as well as non-linear deep neural networks, and canonical correlation analysis (CCA). Meaningful markers of selective auditory attention could be extracted from the ear-EEG signals of all 18 participants, even when those markers were derived from relatively short EEG segments of just five seconds in duration. Algorithms which related the EEG signals to the rising edges of the speech temporal envelope (onset envelope) were more successful than those which made use of the temporal envelope itself. The CCA algorithm achieved the highest mean attention decoding accuracy, although differences between the performances of the three algorithms were both small and not statistically significant when EEG segments of short durations were employed.
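The stimulus-reconstruction approach described in the abstract (a linear backward model mapping EEG channels to the speech envelope, with the attended talker chosen by reconstruction correlation) can be sketched in plain NumPy. This is a simplified illustration, not the authors' implementation: the paper uses auditory-inspired filterbank envelopes and also evaluates CCA and deep networks, whereas this sketch uses a crude rectify-and-smooth envelope, a half-wave-rectified difference as the onset envelope, and ridge regression with an assumed regularization constant.

```python
import numpy as np

def temporal_envelope(audio, smooth=16):
    # Crude broadband envelope: rectify, then moving-average.
    # (A stand-in for the auditory-filterbank envelopes used in the paper.)
    kernel = np.ones(smooth) / smooth
    return np.convolve(np.abs(audio), kernel, mode="same")

def onset_envelope(env):
    # Rising edges only: half-wave rectified first difference of the envelope.
    return np.maximum(np.diff(env, prepend=env[0]), 0.0)

def ridge_decoder(eeg, target, lam=1e2):
    # Backward model: least-squares map from multichannel EEG to the
    # attended envelope, with ridge regularization (lam is an assumption).
    XtX = eeg.T @ eeg + lam * np.eye(eeg.shape[1])
    return np.linalg.solve(XtX, eeg.T @ target)

def decode_attention(eeg_train, env_train, eeg_test, env_a, env_b):
    # Train on the attended envelope, then classify a test segment by
    # which candidate envelope the reconstruction correlates with more.
    w = ridge_decoder(eeg_train, env_train)
    recon = eeg_test @ w
    scores = [np.corrcoef(recon, env)[0, 1] for env in (env_a, env_b)]
    return int(np.argmax(scores)), scores
```

In this backward-model framing, swapping `temporal_envelope` for `onset_envelope` as the regression target is the comparison the paper reports favouring onsets; shortening `eeg_test` emulates the five-second decision windows discussed in the abstract.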
