iFace: Hand-Over-Face Gesture Recognition Leveraging Impedance Sensing (2403.18433v1)

Published 27 Mar 2024 in cs.HC

Abstract: Hand-over-face gestures can convey important implicit cues during conversations, such as frustration or excitement. However, in situations where interlocutors are not visible, such as phone calls or textual communication, the potential meaning contained in hand-over-face gestures is lost. In this work, we present iFace, an unobtrusive, wearable impedance-sensing solution for recognizing different hand-over-face gestures. In contrast to most existing work, iFace does not require the placement of sensors on the user's face or hands. Instead, we propose a novel sensing location, the shoulders, which remains invisible to both the user and outside observers. The system monitors the shoulder-to-shoulder impedance variations caused by gestures through electrodes attached to each shoulder. We evaluated iFace in a user study with eight participants, collecting six kinds of hand-over-face gestures with different meanings. Using a convolutional neural network with user-dependent classification, iFace reaches an 82.58% macro F1 score. We discuss potential application scenarios of iFace as an implicit interaction interface.
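
The abstract outlines the recognition pipeline: windowed shoulder-to-shoulder impedance signals are fed to a convolutional neural network trained per user, and performance is reported as a macro F1 score over the six gesture classes. The sketch below shows one plausible form of such a pipeline in Python; the 1D architecture, window length, channel count, and hyperparameters are illustrative assumptions, since the paper's implementation details are not given here.

```python
# A minimal, hypothetical sketch of a user-dependent gesture classifier in
# the spirit of iFace. Architecture and constants are assumptions, not the
# paper's actual design.
import torch
import torch.nn as nn
from sklearn.metrics import f1_score

NUM_GESTURES = 6   # six hand-over-face gesture classes, per the abstract
NUM_CHANNELS = 1   # assumption: one shoulder-to-shoulder impedance channel
WINDOW_LEN = 256   # assumption: samples per gesture window

class ImpedanceCNN(nn.Module):
    """Small 1D CNN over windowed impedance signals (illustrative)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # Two pooling stages halve the time axis twice: WINDOW_LEN // 4.
        self.classifier = nn.Linear(32 * (WINDOW_LEN // 4), NUM_GESTURES)

    def forward(self, x):  # x: (batch, NUM_CHANNELS, WINDOW_LEN)
        return self.classifier(self.features(x).flatten(1))

# User-dependent evaluation: train and test on one participant's data, then
# report the macro-averaged F1 score, as the abstract does.
model = ImpedanceCNN().eval()
x = torch.randn(32, NUM_CHANNELS, WINDOW_LEN)   # stand-in impedance windows
y_true = torch.randint(0, NUM_GESTURES, (32,))  # stand-in gesture labels
with torch.no_grad():
    y_pred = model(x).argmax(dim=1)
print("macro F1:", f1_score(y_true, y_pred, average="macro"))
```

Macro F1 averages the per-class F1 scores with equal weight, so infrequent gestures count as much as common ones, which is why it is a reasonable choice for a multi-class gesture set like this.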
