- The paper surveys emerging Natural User Interfaces (NUIs) in mobile and wearable computing, highlighting challenges with traditional inputs and the growing need for intuitive interaction in AR/MR/VR contexts.
- It explores NUI modalities such as speech, gesture, gaze, and Brain-Computer Interfaces, detailing the underlying biological signals and their acquisition techniques (EEG, EOG, EMG).
- The survey discusses portable hardware solutions, practical applications in assistive technology, industry, and security, and identifies key open problems for future research, including signal fidelity and privacy.
The paper "Emerging Natural User Interfaces in Mobile Computing: A Bottoms-Up Survey" provides a thorough examination of Natural User Interfaces (NUIs) within the context of mobile computing and wearable technology. The authors explore the constraints of current input methods in mobile and wearable devices, emphasizing the need for new interaction paradigms, particularly with the increasing prevalence of applications in augmented, mixed, and virtual reality (AR/MR/VR).
Key Areas Discussed
Motivation and Challenges
The paper identifies several challenges in current mobile interaction, such as limited screen real estate and constrained input bandwidth. Traditional touch-based input often disrupts users' ongoing activities and is ill-suited to immersive AR/MR/VR environments. The authors argue that NUIs can address these limitations by turning intuitive, natural actions into input.
Biological Foundations and Signal Acquisition
The NUI modalities surveyed build on a small set of biological signals: electrical neural activity, myoelectric (muscle) activity, and ocular movement. These are captured via different signal acquisition techniques (a shared preprocessing sketch follows this list):
- Electroencephalography (EEG): Measures electrical activity in the brain with high temporal resolution.
- Electrooculography (EOG): Tracks eye movements by measuring the corneo-retinal standing potential.
- Electromyography (EMG): Quantifies muscular electric potentials, particularly useful in gesture recognition.
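All three acquisition techniques yield noisy time series that are typically notch-filtered to remove powerline interference and band-pass filtered to isolate the band of interest before any classification. The sketch below is a minimal illustration of that shared preprocessing step, not code from the survey; the sampling rate, cut-off band, and notch frequency are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 250.0  # assumed sampling rate in Hz (typical for consumer EEG boards)

def preprocess(raw, fs=FS, band=(1.0, 40.0), notch_hz=50.0):
    """Remove powerline interference and keep the band of interest.

    `band` is an illustrative choice: roughly 1-40 Hz suits EEG, EOG is
    usually analysed below ~10 Hz, and EMG energy lies well above 20 Hz.
    """
    # Notch out powerline interference (50 Hz here; 60 Hz in some regions).
    b_notch, a_notch = iirnotch(w0=notch_hz, Q=30.0, fs=fs)
    x = filtfilt(b_notch, a_notch, raw)

    # Zero-phase Butterworth band-pass for the band of interest.
    b_bp, a_bp = butter(N=4, Wn=band, btype="bandpass", fs=fs)
    return filtfilt(b_bp, a_bp, x)

if __name__ == "__main__":
    t = np.arange(0, 2.0, 1.0 / FS)
    # Synthetic 10 Hz "alpha-like" component plus 50 Hz interference and noise.
    raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
    raw += 0.1 * np.random.randn(t.size)
    print(preprocess(raw).shape)
```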
NUI Modality Types
The survey highlights several NUI modalities:
- Speech Interfaces: Include silent speech interfaces that address the privacy and usability concerns of audible voice input.
- Gesture Inputs: Rely on motion-based input, though mobile applicability is hampered by interference and motion artifacts.
- Gaze Interaction: Relies on EOG and camera-based systems to track eye movements for input.
- Brain-Computer Interfaces (BCIs): Provide direct neural input, categorized into attention-based paradigms such as Steady-State Visually Evoked Potentials (SSVEPs) and endogenous paradigms such as Motor Imagery (MI); a minimal SSVEP decoding sketch follows this list.
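As a concrete illustration of the attention-based category, the sketch below classifies one EEG trial by canonical correlation analysis against sinusoidal references at each candidate flicker frequency. This is a standard SSVEP decoding recipe, not an algorithm taken from the survey; the sampling rate, channel count, and stimulation frequencies are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250.0                             # assumed sampling rate (Hz)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # hypothetical flicker frequencies (Hz)

def ssvep_references(freq, n_samples, fs=FS, n_harmonics=2):
    """Sine/cosine references at the flicker frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)        # shape: (n_samples, 2 * n_harmonics)

def classify_ssvep(eeg, freqs=STIM_FREQS, fs=FS):
    """Return the candidate frequency whose references correlate best.

    `eeg` is an (n_samples, n_channels) trial, already band-pass filtered.
    """
    scores = []
    for f in freqs:
        refs = ssvep_references(f, eeg.shape[0], fs)
        u, v = CCA(n_components=1).fit_transform(eeg, refs)
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return freqs[int(np.argmax(scores))]

if __name__ == "__main__":
    # Synthetic 2 s, 8-channel trial dominated by a 12 Hz response plus noise.
    t = np.arange(0, 2.0, 1.0 / FS)
    trial = np.outer(np.sin(2 * np.pi * 12.0 * t), np.ones(8))
    trial += 0.5 * np.random.randn(t.size, 8)
    print(classify_ssvep(trial))        # expected: 12.0
```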
Hardware Solutions and Challenges
The paper explores portable NUI hardware, providing an in-depth overview of commercial and open-source devices for EEG, fNIRS, and EMG acquisition. It discusses limitations such as signal noise, spatial and temporal resolution trade-offs, and the ergonomic challenge of combining these devices with mobile outputs such as AR headsets.
Applications and Use Cases
- Assistive Technology: BCIs and EMG NUIs facilitate communication for individuals with disabilities.
- Hybrid Interfaces: Combining modalities offsets the weaknesses of individual inputs, improves usability, and raises Information Transfer Rates (ITRs); the standard ITR formula is given after this list.
- Silent Speech Interfaces: Promise robust communication in noisy environments without audible speech.
- Biometrics and Security: NUIs offer novel secure biometric authentication methods.
- Industrial and Commercial Applications: NUIs enable hands-free operation in maintenance, training, and e-commerce.
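The ITR mentioned under hybrid interfaces is conventionally computed with the Wolpaw formula; the survey's exact formulation may differ, but the standard definition for N selectable targets, classification accuracy P, and trial duration T (in seconds) is:

```latex
% Standard Wolpaw definition of information transfer rate (not copied from the survey):
% N = number of selectable targets, P = classification accuracy, T = trial duration (s)
\[
  B = \log_2 N + P \log_2 P + (1 - P)\,\log_2\!\frac{1 - P}{N - 1}
  \quad \text{bits per trial},
  \qquad
  \mathrm{ITR} = \frac{60}{T}\, B \quad \text{bits per minute}.
\]
```

With illustrative values N = 4, P = 0.9, and T = 2 s, this gives roughly 1.37 bits per trial, or about 41 bits per minute.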
Future Directions and Open Problems
The paper addresses several open research problems, including improving signal fidelity and classification accuracy, overcoming the intrinsic "Midas touch" problem of unintentional input, and addressing ethical concerns such as privacy in the use of bio-signal data. Suggested future work includes rethinking user interfaces and content presentation for immersive AR/MR/VR environments and applying deep learning to bio-signal classification; a minimal classifier sketch follows.
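To make the deep-learning direction concrete, the sketch below shows a small 1-D convolutional classifier for fixed-length bio-signal windows in PyTorch. It is an illustrative baseline under assumed channel, window, and class counts, not an architecture proposed in the survey.

```python
import torch
import torch.nn as nn

class TinyBioSignalCNN(nn.Module):
    """Minimal 1-D CNN for classifying fixed-length bio-signal windows.

    Input shape: (batch, n_channels, n_samples); all sizes are illustrative.
    """
    def __init__(self, n_channels=8, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over time so any window length works
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        z = self.features(x).squeeze(-1)   # (batch, 32)
        return self.classifier(z)          # raw class logits

if __name__ == "__main__":
    model = TinyBioSignalCNN()
    # One batch of 16 synthetic trials: 8 channels, 500 samples (2 s at 250 Hz).
    logits = model(torch.randn(16, 8, 500))
    print(logits.shape)                    # torch.Size([16, 4])
```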
Through comprehensive coverage of the current state and future potential, this survey serves as a valuable resource for researchers exploring the development and application of NUIs in mobile interactions.