Chaos in Motion: Unveiling Robustness in Remote Heart Rate Measurement through Brain-Inspired Skin Tracking (2404.07687v1)
Abstract: Heart rate is an important physiological indicator of human health. Existing remote heart rate measurement methods typically detect the face and then extract a signal from a region of interest (ROI). These state-of-the-art methods suffer from three serious problems: (a) inaccurate or even failed detection caused by environmental influences or subject movement; (b) failure on special patients such as infants and burn victims; (c) privacy leakage from collecting face video. To address these issues, we treat remote heart rate measurement as the analysis of the spatiotemporal characteristics of the optical-flow signal in the video. Applying chaos theory to a computer vision task for the first time, we design a brain-inspired framework: an artificial primary visual cortex model first extracts the skin regions in the video, and the heart rate is then computed by time-frequency analysis over all skin pixels. We call the resulting method Heart Rate measurement via Robust Skin Tracking (HR-RST). Experimental results show that HR-RST overcomes environmental influences and effectively tracks subject movement. Moreover, the method extends to other body parts; consequently, it can be applied to special patients and effectively protects individual privacy, offering an innovative solution.
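The abstract's final step, recovering heart rate by time-frequency analysis of per-pixel intensity traces, can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes a single region-averaged intensity trace sampled at the video frame rate, and uses a plain FFT periodogram restricted to a plausible heart-rate band (0.7–4 Hz, i.e. 42–240 BPM); the function name `estimate_heart_rate` and all parameters are illustrative.

```python
import numpy as np

def estimate_heart_rate(trace, fs, lo_hz=0.7, hi_hz=4.0):
    """Estimate heart rate (BPM) from a 1-D intensity trace.

    trace:  pixel (or region-averaged) intensity over time
    fs:     video frame rate in Hz
    lo_hz, hi_hz: plausible heart-rate band (42-240 BPM)
    """
    x = trace - trace.mean()                      # remove DC component
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)   # frequency axis in Hz
    power = np.abs(np.fft.rfft(x)) ** 2           # simple periodogram
    band = (freqs >= lo_hz) & (freqs <= hi_hz)    # keep physiological band
    return 60.0 * freqs[band][power[band].argmax()]  # peak Hz -> BPM

# Synthetic 20 s trace at 30 fps with a 1.2 Hz (72 BPM) pulse plus noise
rng = np.random.default_rng(0)
t = np.arange(0, 20, 1 / 30.0)
trace = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
bpm = estimate_heart_rate(trace, fs=30.0)
```

In practice one would repeat this per pixel (or use a more robust spectral estimator such as Welch's method) and aggregate the per-pixel estimates.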