A Human-Centric Metaverse Enabled by Brain-Computer Interface: A Survey (2309.01848v1)

Published 4 Sep 2023 in cs.HC

Abstract: The growing interest in the Metaverse has generated momentum for members of academia and industry to innovate toward realizing the Metaverse world. The Metaverse is a unique, continuous, and shared virtual world where humans embody a digital form within an online platform. Through a digital avatar, Metaverse users should have a perceptual presence within the environment and can interact and control the virtual world around them. Thus, a human-centric design is a crucial element of the Metaverse. The human users are not only the central entity but also the source of multi-sensory data that can be used to enrich the Metaverse ecosystem. In this survey, we study the potential applications of Brain-Computer Interface (BCI) technologies that can enhance the experience of Metaverse users. By directly communicating with the human brain, the most complex organ in the human body, BCI technologies hold the potential for the most intuitive human-machine system operating at the speed of thought. BCI technologies can enable various innovative applications for the Metaverse through this neural pathway, such as user cognitive state monitoring, digital avatar control, virtual interactions, and imagined speech communications. This survey first outlines the fundamental background of the Metaverse and BCI technologies. We then discuss the current challenges of the Metaverse that can potentially be addressed by BCI, such as motion sickness when users experience virtual environments or the negative emotional states of users in immersive virtual applications. After that, we propose and discuss a new research direction called Human Digital Twin, in which digital twins can create an intelligent and interactable avatar from the user's brain signals. We also present the challenges and potential solutions in synchronizing and communicating between virtual and physical entities in the Metaverse.

Authors (5)
  1. Howe Yuan Zhu (1 paper)
  2. Nguyen Quang Hieu (16 papers)
  3. Dinh Thai Hoang (125 papers)
  4. Diep N. Nguyen (86 papers)
  5. Chin-Teng Lin (78 papers)
Citations (7)

Summary

This survey paper addresses the potential of Brain-Computer Interface (BCI) technologies to enhance the user experience within the Metaverse, emphasizing a human-centric design approach. The authors posit that BCI offers a unique ability to enrich the Metaverse ecosystem through neural pathways, enabling applications such as cognitive state monitoring, avatar control, virtual interactions, and imagined speech communications.

The authors begin by outlining the fundamental background of the Metaverse and BCI technologies, tracing the Metaverse's development from early concepts in massively multiplayer online (MMO) games to its current state as a convergence of technologies such as Virtual Reality (VR)/Extended Reality (XR), Digital Twin (DT), Artificial Intelligence (AI), and blockchain. They emphasize the importance of a human-centric design that draws on users' behavioral, psychological, physiological, and observational information to improve system performance and usability. They note that conventional sensing techniques such as radio sensing, cameras, and wireless sensors can be used to develop a human-machine interface in the Metaverse, while face tracking, eye tracking, photogrammetry, computer vision, and motion capture can be used to construct fully immersive avatars.

The authors trace the history of BCI research back to 1875 and argue that BCI offers unique opportunities in the Metaverse that would be unachievable with conventional sensing:

  • Direct communication to the brain, bypassing peripheral motor-sensory systems.
  • Encoding of multimodal information onto a single sensor.
  • Higher degrees of encoded information within the signal.
  • Intuitive control and natural interaction.
  • Universal access for users with disabilities.

They identify several challenges hindering the integration of BCI for Metaverse users:

  • The construction of virtual embodiments requiring multi-sensory data from the Metaverse users.
  • A lack of individualization in Metaverse technology.
  • The need for real-time or near real-time processing and communications of BCI signals.
  • Limited knowledge of virtual embodiment.

To address the challenge of virtual embodiment, the authors propose the Human Digital Twin (HDT).

The authors discuss different types of BCI devices/sensors: invasive, semi-invasive, and non-invasive. Non-invasive BCI devices, particularly those using Electroencephalography (EEG) and functional Near-Infrared Spectroscopy (fNIRS), are presented as the primary feasible solution for a portable, wearable device suitable for Metaverse users. They note that EEG devices typically use either wet or dry electrode systems and that, given real-world feasibility factors, dry-electrode EEG systems are currently the most practical choice for Metaverse users. They also discuss signal processing techniques, observational information, and AI classifiers as key components of BCI systems; a minimal sketch of such a processing chain is given below.
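As a concrete illustration of such a pipeline (not part of the survey itself), the following is a minimal sketch of an EEG processing chain: band-pass filtering, band-power feature extraction, and a simple classifier. The sampling rate, frequency bands, channel count, and training data are illustrative assumptions.

```python
# Minimal sketch of a non-invasive EEG BCI pipeline: filter -> features -> classifier.
# All parameters (sampling rate, bands, channels, labels) are illustrative assumptions,
# not values taken from the survey.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.linear_model import LogisticRegression

FS = 250  # assumed sampling rate in Hz (typical for consumer dry-electrode EEG)

def bandpass(eeg, low=1.0, high=40.0, fs=FS, order=4):
    """Band-pass filter each channel to remove drift and high-frequency noise."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def band_power(eeg, band, fs=FS):
    """Average power of each channel within a frequency band (e.g. theta, alpha)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(eeg.shape[-1], fs * 2), axis=-1)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[..., mask].mean(axis=-1)

def extract_features(epoch):
    """Concatenate per-channel band powers into one feature vector per epoch."""
    epoch = bandpass(epoch)
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return np.concatenate([band_power(epoch, b) for b in bands.values()])

# Hypothetical training data: epochs of shape (n_channels, n_samples) with
# placeholder labels such as "high workload" vs. "low workload".
rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 8, FS * 2))   # 40 epochs, 8 channels, 2 s each
labels = rng.integers(0, 2, size=40)            # placeholder labels

X = np.stack([extract_features(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("training accuracy (on synthetic data):", clf.score(X, labels))
```

In a real system the placeholder labels would come from experimental conditions (e.g., high- vs. low-workload blocks), and the classifier would be validated on held-out epochs rather than scored on its own training data.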

The paper identifies immersion as a critical factor in the Metaverse user experience and explores two potential methods of using BCI to enhance it:

  • Emotional and cognitive state recognition.
  • Detection of error-related neural activity to correct anomalies.

The authors draw on Russell's circumplex model of affect, which maps emotional states onto two dimensions, arousal and valence, as well as the Yerkes-Dodson law, which describes the relationship between emotional arousal/stress and cognitive performance. Arousal is detected through changes in brainwave activity in the frontal region, whereas valence is measured through hemispheric symmetry/asymmetry. Cognitive-state features are extracted by evaluating the brainwaves of specific brain regions, with theta activity in the frontal cortex often used as an indicator of mental workload (a feature-extraction sketch follows below). They also posit that VR-sickness and falling indicators can be learned by an AI classifier to detect anomalous events in real time during a Metaverse experience.
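To make this feature-extraction logic concrete, here is a minimal sketch of two markers commonly used in the affective-BCI literature: frontal theta power as a workload proxy and frontal alpha asymmetry as a rough valence proxy. The electrode names (F3, F4, Fz), band limits, and sampling rate are assumptions for illustration, not values specified by the survey.

```python
# Sketch of affect/workload features from frontal EEG. Electrode names (F3, F4, Fz),
# band limits, and the sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 250          # assumed sampling rate in Hz
THETA = (4, 8)    # Hz; frontal theta is commonly linked to mental workload
ALPHA = (8, 13)   # Hz; frontal alpha asymmetry is commonly linked to valence

def band_power(eeg, band, fs=FS):
    """Average power of each channel within a frequency band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(eeg.shape[-1], fs * 2), axis=-1)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[..., mask].mean(axis=-1)

def workload_index(epoch, channel_names):
    """Mean theta power over frontal channels (higher is read as higher workload)."""
    frontal = [i for i, ch in enumerate(channel_names) if ch in ("Fz", "F3", "F4")]
    return band_power(epoch[frontal], THETA).mean()

def valence_asymmetry(epoch, channel_names):
    """Frontal alpha asymmetry, log(right) - log(left); its sign is often used as a
    rough valence indicator."""
    left, right = channel_names.index("F3"), channel_names.index("F4")
    alpha = band_power(epoch[[left, right]], ALPHA)
    return np.log(alpha[1]) - np.log(alpha[0])
```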

The paper explores active BCI paradigms such as P300, Motor Imagery (MI), and Steady-State Visual Evoked Potential (SSVEP) to facilitate more intuitive modes of user interaction within the Metaverse. A P300 paradigm uses an oddball design in which the user attends to a target among several non-target stimuli, whereas an MI paradigm uses imagined left- or right-hand movement as a simple control signal. The SSVEP paradigm, which they note is popular, presents multiple visual stimuli flickering at distinct frequencies; the dominant frequency in the occipital region's activity then indicates which control option the user is focusing on or targeting (see the sketch below). They also discuss using imagined speech enabled by BCI to perform thought-based social interactions.
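As an illustration of the SSVEP control idea, the sketch below picks the flicker frequency whose power (at the fundamental and second harmonic) dominates the occipital-channel spectrum. The stimulus frequencies, sampling rate, and channel selection are illustrative assumptions, not the paper's design; in practice, methods such as canonical correlation analysis (CCA) are often preferred over simple spectral peaks.

```python
# Minimal SSVEP target-detection sketch: pick the flicker frequency with the most
# power in occipital EEG. Stimulus frequencies and FS are illustrative assumptions.
import numpy as np

FS = 250                                   # assumed sampling rate in Hz
STIMULUS_FREQS = [8.0, 10.0, 12.0, 15.0]   # assumed flicker rates of the options

def detect_ssvep_target(occipital_eeg):
    """occipital_eeg: array of shape (n_channels, n_samples), e.g. from O1/O2/Oz.
    Returns the index of the stimulus frequency with the strongest response."""
    n = occipital_eeg.shape[-1]
    spectrum = np.abs(np.fft.rfft(occipital_eeg, axis=-1)).mean(axis=0)
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    scores = []
    for f in STIMULUS_FREQS:
        # Sum power at the stimulus frequency and its 2nd harmonic (common practice).
        score = 0.0
        for harmonic in (f, 2 * f):
            bin_idx = np.argmin(np.abs(freqs - harmonic))
            score += spectrum[bin_idx]
        scores.append(score)
    return int(np.argmax(scores))

# Example with synthetic data: a 12 Hz sinusoid plus noise should map to index 2.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1.0 / FS)
fake = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.standard_normal((3, t.size))
print(detect_ssvep_target(fake))   # expected: 2 (the 12 Hz option)
```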

The authors introduce the HDT as the ultimate Metaverse BCI prosthesis, creating a stable population of Metaverse users that maintains continuity between the Metaverse and the real world. Key enabling technologies for the HDT include BCI, wearable biosensors (e.g., heart rate, muscle activity, and Inertial Measurement Unit (IMU) sensors), smartphones, and AI. The interactions between a real-world user and the HDT are described in four scenarios: User-HDT Synchronization, HDT-Avatar Interaction, HDT-HDT Interaction, and User-HDT Replacement. The challenges of developing the HDT in the Metaverse are considered from two main perspectives: (i) communications between BCI headsets and other infrastructure in the physical world and (ii) communications between human avatars and other avatars or technologies/virtual services in the Metaverse. They claim that 6G systems can use broadband techniques such as millimeter wave (mmWave) and terahertz (THz) communications to further increase data rates, and that recent advances in machine learning and wireless communication enable transmission beyond the classical Shannon bound through semantic compression and semantic communication.

The survey concludes by addressing open issues regarding BCI usability, ethics, and security in the Metaverse. Key challenges include:

  • Hardware development to extract high-quality brain signals.
  • Software development to address neurodiversity.
  • Security and privacy concerns related to the collection and potential manipulation of brain data.
  • Ethical considerations regarding the interpretation and use of brain signals.

The authors mention emerging applications and future research directions such as integrated VR-BCI devices, multitasking in the Metaverse, machine learning for processing heterogeneous datasets, and Human Digital Twin for maintaining continuity in the Metaverse.