EGNN-C+: Interpretable Evolving Granular Neural Network and Application in Classification of Weakly-Supervised EEG Data Streams (2402.17792v1)
Abstract: We introduce a modified incremental learning algorithm for evolving Granular Neural Network Classifiers (eGNN-C+). We use double-boundary hyper-boxes to represent granules, and customize the adaptation procedures to enhance the robustness of outer boxes for data coverage and noise suppression, while ensuring that inner boxes remain flexible enough to capture drifts. The classifier evolves from scratch, incorporates new classes on the fly, and performs local incremental feature weighting. As an application, we focus on the classification of emotion-related patterns within electroencephalogram (EEG) signals. Emotion recognition is crucial for enhancing the realism and interactivity of computer systems. We extract features from the Fourier spectrum of EEG signals obtained from 28 individuals engaged in playing computer games -- a public dataset. Each game elicits a different predominant emotion: boredom, calmness, horror, or joy. We analyze individual electrodes, time-window lengths, and frequency bands to assess the accuracy and interpretability of the resulting user-independent neural models. The findings indicate that both brain hemispheres assist classification, especially electrodes over the temporal (T8) and parietal (P7) areas, alongside contributions from frontal and occipital electrodes. While patterns may manifest in any band, the Alpha (8-13Hz), Delta (1-4Hz), and Theta (4-8Hz) bands, in this order, exhibited the highest correspondence with the emotion classes. The eGNN-C+ demonstrates effectiveness in learning from EEG data. It achieves an accuracy of 81.7% and an interpretability index (II) of 0.0029 using 10-second time windows, even in the face of a highly stochastic, time-varying 4-class classification problem.
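To make the double-boundary idea concrete, the sketch below shows one possible way a granule with an inner and an outer hyper-box could be represented and updated. The class name, the `margin` parameter, and the damped outer-box update rule are illustrative assumptions, not the paper's exact adaptation procedure; they only convey the intent that the outer box grows conservatively (noise suppression) while the inner box tracks samples closely (drift capture).

```python
import numpy as np

class Granule:
    """Hypothetical double-boundary hyper-box granule: an outer box
    for data coverage and noise suppression, and an inner box kept
    tight around observed samples to remain sensitive to drift.
    Update rules here are illustrative, not the paper's exact ones."""

    def __init__(self, x, margin=0.1):
        x = np.asarray(x, dtype=float)
        # Inner box starts as a point at the first sample.
        self.inner_lo = x.copy()
        self.inner_hi = x.copy()
        # Outer box starts slightly enlarged for coverage (assumed margin).
        self.outer_lo = x - margin
        self.outer_hi = x + margin

    def covers(self, x):
        """True if the sample lies inside the outer hyper-box."""
        x = np.asarray(x, dtype=float)
        return bool(np.all(x >= self.outer_lo) and np.all(x <= self.outer_hi))

    def update(self, x, alpha=0.5):
        """Expand the inner box to include the new sample; expand the
        outer box only by a damped fraction of the overshoot, so a
        single outlier cannot inflate it."""
        x = np.asarray(x, dtype=float)
        self.inner_lo = np.minimum(self.inner_lo, x)
        self.inner_hi = np.maximum(self.inner_hi, x)
        self.outer_lo += alpha * np.minimum(0.0, x - self.outer_lo)
        self.outer_hi += alpha * np.maximum(0.0, x - self.outer_hi)

g = Granule([0.5, 0.5])
g.update([0.8, 0.4])
print(g.covers([0.6, 0.45]))  # sample inside the expanded outer box -> True
```

In an evolving classifier of this family, a new granule would typically be created when no existing outer box covers an incoming sample, which is how the model grows from scratch and admits new classes on the fly.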
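The abstract's features come from the Fourier spectrum of windowed EEG, separated into the named frequency bands. A common baseline for such features is mean spectral power per band, sketched below; the sampling rate and the exact feature definition are assumptions, since the paper may use a different spectral summary.

```python
import numpy as np

# Bands named in the abstract (Hz); boundaries follow the stated ranges.
BANDS = {"Delta": (1, 4), "Theta": (4, 8), "Alpha": (8, 13)}

def band_powers(signal, fs):
    """Mean spectral power per band for one EEG time window."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].mean())
            for name, (lo, hi) in BANDS.items()}

fs = 128                              # Hz; a typical consumer-EEG rate (assumption)
t = np.arange(10 * fs) / fs           # 10-second window, as in the abstract
x = np.sin(2 * np.pi * 10 * t)        # pure 10 Hz tone falls in the Alpha band
p = band_powers(x, fs)
print(max(p, key=p.get))              # prints "Alpha"
```

One such vector per electrode and window would then feed the classifier, which is what makes the per-electrode and per-band accuracy analysis in the abstract possible.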