A Recurrent Neural Network Enhanced Unscented Kalman Filter for Human Motion Prediction (2402.13045v1)
Abstract: This paper presents a deep learning enhanced adaptive unscented Kalman filter (UKF) for predicting human arm motion in the context of manufacturing. Unlike previous network-based methods that rely solely on captured human motion data, represented here as bone vectors, we incorporate a human arm dynamic model into the motion prediction algorithm and use the UKF to iteratively forecast human arm motions. Specifically, a Lagrangian-mechanics-based physical model is employed to correlate arm motions with the associated muscle forces. A Recurrent Neural Network (RNN) is then integrated into the framework to predict future muscle forces, which are mapped back to future arm motions through the dynamic model. Since no measurement data exist for future human motions that could be fed into the UKF to update the state, we integrate a second RNN to directly predict future human motions and treat its prediction as a surrogate measurement for the UKF. A noteworthy aspect of this study is the quantification of uncertainties associated with both the data-driven and physical models in one unified framework. These quantified uncertainties are used to adapt the measurement and process noise covariances of the UKF over time. This adaptation, driven by the uncertainties of the RNN models, compensates for inaccuracies in the data-driven model and mitigates discrepancies between the assumed and true physical models, ultimately enhancing the accuracy and robustness of the predictions. Compared to traditional RNN-based prediction, our method demonstrates improved accuracy and robustness in extensive experimental validations across various types of human motion.
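The core mechanism described above — a UKF whose process and measurement noise covariances are inflated by RNN uncertainty estimates, with an RNN prediction serving as a surrogate measurement — can be sketched as follows. This is a minimal illustration under simplifying assumptions (diagonal variance estimates from the RNNs, a standard scaled unscented transform); the function names, signatures, and parameters are illustrative and not the paper's implementation.

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 scaled sigma points and their mean/covariance weights."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)           # matrix square root
    pts = np.vstack([x, x + S.T, x - S.T])          # shape (2n+1, n)
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_step(x, P, f, h, z_surrogate, Q_base, R_base, var_proc, var_meas):
    """One adaptive UKF cycle.

    f, h        : process and measurement models (here: the arm dynamics)
    z_surrogate : RNN-predicted future motion, used as a surrogate measurement
    var_proc    : uncertainty of the muscle-force RNN, inflating Q over time
    var_meas    : uncertainty of the motion RNN, inflating R over time
    """
    Q = Q_base + np.diag(var_proc)   # process noise adapted by model uncertainty
    R = R_base + np.diag(var_meas)   # measurement noise adapted by RNN uncertainty

    # --- predict ---
    pts, wm, wc = sigma_points(x, P)
    Xp = np.array([f(p) for p in pts])
    x_pred = wm @ Xp
    P_pred = Q + sum(w * np.outer(d, d) for w, d in zip(wc, Xp - x_pred))

    # --- update with the surrogate measurement ---
    pts, wm, wc = sigma_points(x_pred, P_pred)
    Zp = np.array([h(p) for p in pts])
    z_pred = wm @ Zp
    S = R + sum(w * np.outer(d, d) for w, d in zip(wc, Zp - z_pred))
    C = sum(w * np.outer(dx, dz)
            for w, dx, dz in zip(wc, pts - x_pred, Zp - z_pred))
    K = C @ np.linalg.inv(S)                        # Kalman gain
    x_new = x_pred + K @ (z_surrogate - z_pred)
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new
```

The key design point is that when either RNN is uncertain, the corresponding covariance (Q or R) grows, so the filter automatically down-weights that source and leans on the other, which is what yields the robustness gains reported in the abstract.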