Temporal Persistence and Intercorrelation of Embeddings Learned by an End-to-End Deep Learning Eye Movement-driven Biometrics Pipeline (2402.16399v2)
Abstract: What qualities make a feature useful for biometric performance? In prior research, pre-dating the advent of deep learning (DL) approaches to biometric analysis, a strong relationship was noted between temporal persistence, as indexed by the intraclass correlation coefficient (ICC), and biometric performance, as indexed by the Equal Error Rate (EER). More generally, the claim was made that good biometric performance results from a relatively large set of weakly intercorrelated features with high ICC. The present study aimed to determine whether the same relationships hold in a state-of-the-art DL-based eye movement biometric system (``Eye-Know-You-Too''), as applied to two publicly available eye movement datasets. To this end, we manipulated various aspects of eye-tracking signal quality, which produced variation in biometric performance, and related that performance to the temporal persistence and intercorrelation of the resulting embeddings. Data-quality indices were related to EER with either linear or logarithmic fits, and the resulting model R^2 was noted. Overall, we found that temporal persistence was an important predictor of DL-based biometric performance, and that DL-learned embeddings were generally weakly intercorrelated.
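The three quantities the abstract relates can be made concrete with a short sketch. The following Python code (illustrative only, not the authors' implementation) computes ICC(3,1), one common variant of the intraclass correlation coefficient used to index temporal persistence; the mean absolute pairwise correlation among embedding dimensions, one way to summarize intercorrelation; and the R^2 of a linear or logarithmic least-squares fit, mirroring how data-quality indices were related to EER. Function names and the exact ICC variant are assumptions.

```python
import numpy as np

def icc_3_1(x):
    """ICC(3,1): two-way mixed model, consistency, single measurement.
    x: (n_subjects, k_sessions) array of one embedding feature,
    one value per subject per recording session."""
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between-subject
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between-session
    ss_total = np.sum((x - grand) ** 2)
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

def mean_abs_intercorrelation(emb):
    """Mean |Pearson r| over all pairs of embedding dimensions.
    emb: (n_samples, n_dims) matrix of learned embeddings."""
    c = np.corrcoef(emb, rowvar=False)
    off = c[~np.eye(c.shape[0], dtype=bool)]  # off-diagonal entries
    return np.abs(off).mean()

def fit_r2(x, y, log_x=False):
    """R^2 of a least-squares line of y on x (or on log x),
    as in the linear/logarithmic fits of quality indices to EER."""
    xt = np.log(x) if log_x else np.asarray(x, dtype=float)
    slope, intercept = np.polyfit(xt, y, 1)
    resid = y - (slope * xt + intercept)
    return 1.0 - np.sum(resid ** 2) / np.sum((y - np.mean(y)) ** 2)
```

For intuition: a feature that is identical across sessions within each subject but differs between subjects yields ICC(3,1) = 1 (perfect temporal persistence), while a feature that varies as much within subjects as between them yields an ICC near 0.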
- R. Alrawili, A. A. S. AlQahtani, and M. K. Khan, “Comprehensive survey: Biometric user authentication application, evaluation, and discussion,” arXiv preprint arXiv:2311.13416, 2023.
- L. Friedman, M. S. Nixon, and O. V. Komogortsev, “Method to assess the temporal persistence of potential biometric features: Application to oculomotor, gait, face and brain structure databases,” PLOS ONE, vol. 12, no. 6, pp. 1–42, Jun. 2017. [Online]. Available: https://doi.org/10.1371/journal.pone.0178501
- L. Friedman, H. S. Stern, L. R. Price, and O. V. Komogortsev, “Why temporal persistence of biometric features, as assessed by the intraclass correlation coefficient, is so valuable for classification performance,” Sensors, vol. 20, no. 16, p. 4555, 2020.
- P. Kasprowski and J. Ober, “Eye movements in biometrics,” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 3087, pp. 248–258, 2004. [Online]. Available: https://doi.org/10.1007/978-3-540-25976-3_23
- C. Katsini, Y. Abdrabou, G. E. Raptis, M. Khamis, and F. Alt, “The Role of Eye Gaze in Security and Privacy Applications: Survey and Future HCI Research Directions,” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, ser. CHI ’20. Honolulu, HI, USA: Association for Computing Machinery, Apr. 2020, pp. 1–21. [Online]. Available: https://doi.org/10.1145/3313831.3376840
- G. Bargary, J. M. Bosten, P. T. Goodbourn, A. J. Lawrance-Owen, R. E. Hogg, and J. D. Mollon, “Individual differences in human eye movements: An oculomotor signature?” Vision Research, vol. 141, pp. 157–169, Dec. 2017. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0042698917300391
- M. Nilsson Benfatto, G. Öqvist Seimyr, J. Ygge, T. Pansell, A. Rydberg, and C. Jacobson, “Screening for dyslexia using eye tracking during reading,” PLOS ONE, vol. 11, no. 12, p. e0165508, 2016.
- L. Billeci, A. Narzisi, A. Tonacci, B. Sbriscia-Fioretti, L. Serasini, F. Fulceri, F. Apicella, F. Sicca, S. Calderoni, and F. Muratori, “An integrated EEG and eye-tracking approach for the study of responding and initiating joint attention in autism spectrum disorders,” Scientific Reports, vol. 7, no. 1, p. 13560, 2017.
- B. A. Sargezeh, N. Tavakoli, and M. R. Daliri, “Gender-based eye movement differences in passive indoor picture viewing: An eye-tracking study,” Physiology & Behavior, vol. 206, pp. 43–50, 2019.
- S. M. K. Al Zaidawi, M. H. Prinzler, C. Schröder, G. Zachmann, and S. Maneth, “Gender classification of prepubescent children via eye movements with reading stimuli,” in Companion Publication of the 2020 International Conference on Multimodal Interaction, 2020, pp. 1–6.
- C. Schröder, S. M. K. Al Zaidawi, M. H. Prinzler, S. Maneth, and G. Zachmann, “Robustness of eye movement biometrics against varying stimuli and varying trajectory length,” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020, pp. 1–7.
- I. Rigas and O. V. Komogortsev, “Current research in eye movement biometrics: An analysis based on BioEye 2015 competition,” Image and Vision Computing, vol. 58, pp. 129–141, 2017.
- S. Eberz, K. Rasmussen, V. Lenders, and I. Martinovic, “Preventing lunchtime attacks: Fighting insider threats with eye movement biometrics,” in Network and Distributed System Security (NDSS) Symposium. Internet Society, 2015. [Online]. Available: http://dx.doi.org/10.14722/ndss.2015.23203
- O. V. Komogortsev, A. Karpov, and C. D. Holland, “Attack of mechanical replicas: Liveness detection with eye movements,” IEEE Transactions on Information Forensics and Security, vol. 10, no. 4, pp. 716–725, 2015. [Online]. Available: https://doi.org/10.1109/TIFS.2015.2405345
- I. Rigas and O. V. Komogortsev, “Eye Movement-Driven Defense against Iris Print-Attacks,” Pattern Recognition Letters, vol. 68, no. P2, pp. 316–326, Dec. 2015. [Online]. Available: http://dx.doi.org/10.1016/j.patrec.2015.06.011
- M. H. Raju, D. J. Lohr, and O. Komogortsev, “Iris print attack detection using eye movement signals,” in 2022 Symposium on Eye Tracking Research and Applications, ser. ETRA ’22. New York, NY, USA: Association for Computing Machinery, 2022. [Online]. Available: https://doi.org/10.1145/3517031.3532521
- D. Lohr, H. Griffith, S. Aziz, and O. Komogortsev, “A metric learning approach to eye movement biometrics,” in 2020 IEEE International Joint Conference on Biometrics (IJCB). IEEE, 2020, pp. 1–7. [Online]. Available: http://dx.doi.org/10.1109/IJCB48548.2020.9304859
- D. J. Lohr, S. Aziz, and O. Komogortsev, “Eye movement biometrics using a new dataset collected in virtual reality,” in ACM Symposium on Eye Tracking Research and Applications, ser. ETRA ’20 Adjunct. New York, NY, USA: Association for Computing Machinery, 2020. [Online]. Available: https://doi.org/10.1145/3379157.3391420
- D. Lohr, H. Griffith, and O. V. Komogortsev, “Eye know you: Metric learning for end-to-end biometric authentication using eye movements from a longitudinal dataset,” IEEE Transactions on Biometrics, Behavior, and Identity Science, 2022.
- L. A. Jäger, S. Makowski, P. Prasse, S. Liehr, M. Seidler, and T. Scheffer, “Deep Eyedentification: Biometric identification using micro-movements of the eye,” in Machine Learning and Knowledge Discovery in Databases: European Conference, ECML PKDD 2019, Würzburg, Germany, September 16–20, 2019, Proceedings, Part II. Springer, 2020, pp. 299–314.
- S. Makowski, P. Prasse, D. R. Reich, D. Krakowczyk, L. A. Jäger, and T. Scheffer, “DeepEyedentificationLive: Oculomotoric biometric identification and presentation-attack detection using deep neural networks,” IEEE Transactions on Biometrics, Behavior, and Identity Science, vol. 3, no. 4, pp. 506–518, 2021.
- D. Lohr and O. V. Komogortsev, “Eye know you too: Toward viable end-to-end eye movement biometrics for user authentication,” IEEE Transactions on Information Forensics and Security, vol. 17, pp. 3151–3164, 2022.
- A. R. A. Brasil, J. O. Andrade, and K. S. Komati, “Eye movements biometrics: A bibliometric analysis from 2004 to 2019,” arXiv preprint arXiv:2006.01310, 2020.
- Y. Zhang and X. Mou, “Survey on eye movement based authentication systems,” in Computer Vision: CCF Chinese Conference, CCCV 2015, Xi’an, China, September 18-20, 2015, Proceedings, Part I. Springer, 2015, pp. 144–159.
- Y. Zhang and M. Juhola, “On biometrics with eye movements,” IEEE Journal of Biomedical and Health Informatics, vol. 21, no. 5, pp. 1360–1366, 2016.
- R. Andersson, L. Larsson, K. Holmqvist, M. Stridh, and M. Nyström, “One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms,” Behavior Research Methods, vol. 49, no. 2, pp. 616–637, 2017. [Online]. Available: https://doi.org/10.3758/s13428-016-0738-9
- M. Nyström and K. Holmqvist, “An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data,” Behavior Research Methods, vol. 42, no. 1, pp. 188–204, 2010.
- J. Pekkanen and O. Lappi, “A new and general approach to signal denoising and eye movement classification based on segmented linear regression,” Scientific Reports, vol. 7, no. 1, p. 17726, 2017.
- A. H. Dar, A. S. Wagner, and M. Hanke, “REMoDNaV: Robust eye-movement classification for dynamic stimulation,” Behavior Research Methods, vol. 53, no. 1, pp. 399–414, 2021.
- L. Friedman, I. Rigas, E. Abdulin, and O. V. Komogortsev, “A novel evaluation of two related and two independent algorithms for eye movement classification during reading,” Behavior Research Methods, vol. 50, no. 4, pp. 1374–1397, Aug. 2018. [Online]. Available: https://doi.org/10.3758/s13428-018-1050-7
- C. Li, J. Xue, C. Quan, J. Yue, and C. Zhang, “Biometric recognition via texture features of eye movement trajectories in a visual searching task,” PLOS ONE, vol. 13, no. 4, p. e0194475, Apr. 2018. [Online]. Available: https://dx.plos.org/10.1371/journal.pone.0194475
- I. Rigas, G. Economou, and S. Fotopoulos, “Biometric identification based on the eye movements and graph matching techniques,” Pattern Recognition Letters, vol. 33, no. 6, pp. 786–792, 2012.
- I. Rigas, O. Komogortsev, and R. Shadmehr, “Biometric recognition via eye movements: Saccadic vigor and acceleration cues,” ACM Transactions on Applied Perception (TAP), vol. 13, no. 2, pp. 1–21, 2016.
- C. D. Holland and O. V. Komogortsev, “Complex eye movement pattern biometrics: Analyzing fixations and saccades,” in 2013 International conference on biometrics (ICB). IEEE, 2013, pp. 1–8.
- A. George and A. Routray, “A score level fusion method for eye movement biometrics,” Pattern Recognition Letters, vol. 82, pp. 207–215, 2016.
- S. Jia, D. H. Koh, A. Seccia, P. Antonenko, R. Lamb, A. Keil, M. Schneps, and M. Pomplun, “Biometric recognition through eye movements using a recurrent neural network,” in Proceedings - 9th IEEE International Conference on Big Knowledge, ICBK 2018. Institute of Electrical and Electronics Engineers Inc., Dec. 2018, pp. 57–64.
- A. Abdelwahab and N. Landwehr, “Deep distributional sequence embeddings based on a Wasserstein loss,” Neural Processing Letters, vol. 54, no. 5, pp. 3749–3769, 2022.
- H. Griffith, D. Lohr, E. Abdulin, and O. Komogortsev, “GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset,” Scientific Data, vol. 8, no. 1, p. 184, 2021.
- D. Lohr, S. Aziz, L. Friedman, and O. V. Komogortsev, “GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality,” Scientific Data, vol. 10, no. 1, 2023.
- A. Savitzky and M. J. E. Golay, “Smoothing and differentiation of data by simplified least squares procedures,” Analytical Chemistry, vol. 36, no. 8, pp. 1627–1639, 1964.
- G. Huang, Z. Liu, L. Van Der Maaten, and K. Q. Weinberger, “Densely connected convolutional networks,” in Proceedings of the IEEE conference on computer vision and pattern recognition, 2017, pp. 4700–4708.
- D. P. Kingma and J. Ba, “Adam: A method for stochastic optimization,” arXiv preprint arXiv:1412.6980, 2014.
- L. N. Smith and N. Topin, “Super-convergence: Very fast training of neural networks using large learning rates,” in Artificial intelligence and machine learning for multi-domain operations applications, vol. 11006. SPIE, 2019, pp. 369–386.
- X. Wang, X. Han, W. Huang, D. Dong, and M. R. Scott, “Multi-similarity loss with general pair weighting for deep metric learning,” in 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 5017–5025.
- A. P. Field, “Kendall’s coefficient of concordance,” Encyclopedia of Statistics in Behavioral Science, vol. 2, pp. 1010–1011, 2005.
- K. Musgrave, S. Belongie, and S.-N. Lim, “PyTorch metric learning,” 2020.
- D. J. Lohr, L. Friedman, and O. V. Komogortsev, “Evaluating the data quality of eye tracking signals from a virtual reality system: Case study using SMI’s eye-tracking HTC Vive,” arXiv preprint arXiv:1912.02083, 2019.