On Task and in Sync: Examining the Relationship between Gaze Synchrony and Self-Reported Attention During Video Lecture Learning (2404.00333v1)

Published 30 Mar 2024 in cs.HC

Abstract: Successful learning depends on learners' ability to sustain attention, which is particularly challenging in online education due to limited teacher interaction. Gaze synchrony is a potential indicator of attention: in controlled experiments that manipulated attention, it has demonstrated predictive power for learning achievement in video-based learning. This study (N=84) examines the relationship between gaze synchrony and learners' self-reported attention, collected via experience sampling during realistic online video learning. Gaze synchrony was assessed through the Kullback-Leibler divergence of gaze density maps and through MultiMatch scanpath comparisons. Both measures indicated significantly higher gaze synchrony in attentive participants, and self-reported attention significantly predicted post-test scores. In contrast, the synchrony measures did not correlate with learning outcomes. While the results support the hypothesis that attentive learners exhibit similar eye movements, using synchrony directly as an attention indicator remains challenging and calls for further research on the interplay of attention, gaze synchrony, and video content type.
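The abstract does not spell out how the density-map comparison is computed. As a rough illustration of the general technique, one can bin each group's gaze samples into a 2D histogram, smooth and normalize it into a probability distribution, and compare two such maps with the Kullback-Leibler divergence. The bin counts, screen size, and smoothing width below are illustrative assumptions, not values from the study:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaze_density_map(gaze_xy, screen_wh=(1920, 1080), bins=(64, 36), sigma=1.0):
    """Build a smoothed, normalized gaze density map from (x, y) gaze samples.

    gaze_xy: array of shape (n_samples, 2) with pixel coordinates.
    Returns a 2D array that sums to 1 (a probability distribution).
    """
    w, h = screen_wh
    hist, _, _ = np.histogram2d(
        gaze_xy[:, 0], gaze_xy[:, 1],
        bins=bins, range=[[0, w], [0, h]],
    )
    hist = gaussian_filter(hist, sigma=sigma)  # smooth sparse samples
    hist += 1e-12                              # avoid zeros so log() is defined
    return hist / hist.sum()

def kl_divergence(p, q):
    """KL(P || Q) between two density maps of identical shape; 0 iff P == Q."""
    return float(np.sum(p * np.log(p / q)))
```

A lower divergence between one learner's density map and the rest of the group's would then indicate higher gaze synchrony. Scanpath-level comparison, by contrast, is handled in the cited work by the MultiMatch algorithm, which aligns fixation sequences as vectors rather than comparing aggregate densities.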
