Beyond Gait: Learning Knee Angle for Seamless Prosthesis Control in Multiple Scenarios (2404.06772v1)

Published 10 Apr 2024 in cs.RO

Abstract: Deep learning models have become a powerful tool for knee angle estimation in lower limb prostheses, owing to their adaptability across various gait phases and locomotion modes. Current methods employ Multi-Layer Perceptrons (MLP), Long Short-Term Memory networks (LSTM), and Convolutional Neural Networks (CNN), predominantly analyzing motion information from the thigh. In contrast to these approaches, our study adopts a holistic perspective by using whole-body movement as input. We propose a transformer-based probabilistic framework, termed the Angle Estimation Probabilistic Model (AEPM), that offers precise angle estimation across a wide range of scenarios beyond walking. AEPM achieves an overall RMSE of 6.70 degrees, with an RMSE of 3.45 degrees in walking scenarios, improving prediction accuracy for walking by 11.31% over the state of the art. Our method achieves seamless adaptation between different locomotion modes, and the model can also be used to analyze the synergy between the knee and other joints. We show that whole-body movement carries valuable information about knee movement, which can inform the design of sensors for prostheses. The code is available at https://github.com/penway/Beyond-Gait-AEPM.
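
The abstract describes the setup only at a high level. As a rough illustration of the input/output structure, the sketch below regresses per-frame knee angle from whole-body joint kinematics with a transformer encoder and reports RMSE in degrees. It is not the authors' AEPM (which is a probabilistic model); the joint count, window length, layer sizes, and the names KneeAngleTransformer and rmse_degrees are illustrative assumptions. See the official repository (https://github.com/penway/Beyond-Gait-AEPM) for the actual implementation.

```python
# Minimal, hypothetical sketch of a transformer-based knee angle regressor,
# in the spirit of (but not identical to) AEPM. All hyperparameters below
# are illustrative assumptions, not the authors' configuration.
import torch
import torch.nn as nn


class KneeAngleTransformer(nn.Module):
    def __init__(self, num_joints=22, joint_dim=3, d_model=128,
                 nhead=4, num_layers=4, window=64):
        super().__init__()
        # Project per-frame whole-body kinematics (all joints) to d_model.
        self.input_proj = nn.Linear(num_joints * joint_dim, d_model)
        # Learned positional embedding over the temporal window.
        self.pos_emb = nn.Parameter(torch.zeros(1, window, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        # Regress a knee angle (in degrees) for every frame in the window.
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):
        # x: (batch, window, num_joints * joint_dim) whole-body motion input
        h = self.input_proj(x) + self.pos_emb[:, : x.shape[1]]
        h = self.encoder(h)
        return self.head(h).squeeze(-1)  # (batch, window) knee angles


def rmse_degrees(pred, target):
    # Root-mean-square error in degrees, the metric quoted in the abstract.
    return torch.sqrt(torch.mean((pred - target) ** 2))


if __name__ == "__main__":
    model = KneeAngleTransformer()
    motion = torch.randn(8, 64, 22 * 3)      # synthetic whole-body input
    true_angle = torch.rand(8, 64) * 90.0    # synthetic ground-truth angles
    est = model(motion)
    print("RMSE (deg):", rmse_degrees(est, true_angle).item())
```

A deterministic regressor like this only returns a point estimate; the paper's probabilistic formulation additionally models uncertainty over the knee angle, which this sketch does not attempt to reproduce.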

