Towards Attention-Aware Large Language Models: Integrating Real-Time Eye-Tracking and EEG for Adaptive AI Responses (2511.06468v1)
Abstract: This project proposes an attention-aware LLM that uses EEG and eye tracking to monitor user attention dynamically. To realize this, real-time EEG and eye-tracking data are integrated into an LLM-based interactive system that classifies the user's attention state on the fly. The system identifies five attention states: High Attention, Stable Attention, Dropping Attention, Cognitive Overload, and Distraction. It responds accordingly to each state, with a particular focus on adapting to dropping attention, distraction, and cognitive overload to improve user engagement and reduce cognitive load.
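The abstract describes a pipeline in which physiological signals are mapped to one of five attention states, which in turn steers the LLM's next response. The sketch below illustrates one possible shape of that pipeline, assuming a simple rule-based classifier over hand-picked features (an EEG engagement ratio, on-screen gaze fraction, and blink rate) and a state-to-system-prompt adaptation table. The feature names, thresholds, and prompt wordings are illustrative assumptions, not the paper's method.

```python
from enum import Enum
from dataclasses import dataclass


class AttentionState(Enum):
    HIGH = "High Attention"
    STABLE = "Stable Attention"
    DROPPING = "Dropping Attention"
    OVERLOAD = "Cognitive Overload"
    DISTRACTION = "Distraction"


@dataclass
class SensorSample:
    # Hypothetical per-window features; the paper does not specify its feature set.
    eeg_engagement: float   # e.g. beta / (alpha + theta) band-power ratio
    gaze_on_screen: float   # fraction of gaze samples landing on the response area
    blink_rate: float       # blinks per minute


def classify_attention(s: SensorSample) -> AttentionState:
    """Toy rule-based classifier; all thresholds are illustrative only."""
    if s.gaze_on_screen < 0.4:
        return AttentionState.DISTRACTION
    if s.eeg_engagement > 1.2 and s.blink_rate > 25:
        return AttentionState.OVERLOAD
    if s.eeg_engagement > 0.9:
        return AttentionState.HIGH
    if s.eeg_engagement > 0.6:
        return AttentionState.STABLE
    return AttentionState.DROPPING


# Hypothetical adaptation policy: each state maps to a system-prompt directive
# that steers how the LLM answers the user's next message.
ADAPTATION_PROMPTS = {
    AttentionState.HIGH: "Continue at the current depth and pace.",
    AttentionState.STABLE: "Continue, varying examples to sustain engagement.",
    AttentionState.DROPPING: "Shorten the answer and ask a quick check-in question.",
    AttentionState.OVERLOAD: "Simplify: summarize in plain language, one idea at a time.",
    AttentionState.DISTRACTION: "Pause new content; briefly recap before continuing.",
}


def adapt_response(state: AttentionState, user_message: str) -> list[dict]:
    """Build a chat request whose system prompt reflects the detected state."""
    return [
        {"role": "system", "content": ADAPTATION_PROMPTS[state]},
        {"role": "user", "content": user_message},
    ]


if __name__ == "__main__":
    sample = SensorSample(eeg_engagement=0.55, gaze_on_screen=0.8, blink_rate=14)
    state = classify_attention(sample)
    print(state.value)
    print(adapt_response(state, "Explain backpropagation."))
```

In a real system the rule-based classifier would likely be replaced by a model trained on labeled EEG/eye-tracking windows, but the interface (sensor window in, attention state out, prompt adaptation downstream) stays the same.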