Terrain-Aware Stride-Level Trajectory Forecasting for a Powered Hip Exoskeleton via Vision and Kinematics Fusion (2404.11945v1)
Abstract: Powered hip exoskeletons have shown the ability to assist locomotion during treadmill walking. However, providing suitable assistance in real-world walking scenarios that involve changing terrain remains challenging. Recent research suggests that forecasting lower limb joint angles could provide target trajectories for exoskeletons and prostheses, and that forecasting performance can be improved with visual information. In this letter, we share a real-world dataset of 10 healthy subjects walking through five common types of terrain, with stride-level labels. We design a network called the Sandwich Fusion Transformer for Image and Kinematics (SFTIK), which predicts the thigh angle of the ensuing stride given the terrain images captured at the beginning of the preceding and ensuing strides and the IMU time series recorded during the preceding stride. We introduce width-level patchify, a patch embedding scheme tailored to egocentric terrain images, to reduce computational demands. We demonstrate that the proposed sandwich input and fusion mechanism significantly improves forecasting performance. Overall, SFTIK outperforms baseline methods, achieving a computational cost of 3.31 GFLOPs, a root mean square error (RMSE) of 3.445 ± 0.804°, and a Pearson's correlation coefficient (PCC) of 0.971 ± 0.025. These results demonstrate that SFTIK can forecast the thigh angle accurately at low computational cost, and could therefore serve as a terrain-adaptive trajectory planning method for hip exoskeletons. Code and data are available at https://github.com/RuoqiZhao116/SFTIK.
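The abstract fixes the model's interface: two egocentric terrain images (one from the start of the preceding stride, one from the start of the ensuing stride) "sandwich" the preceding stride's IMU sequence, and the network regresses the ensuing stride's thigh-angle trajectory. Below is a minimal PyTorch sketch of that interface only; every dimension, module choice, and name (`SandwichFusionSketch`, `d_model`, `out_len`, the 224×224 image size, 6-axis IMU, patch sizes) is an illustrative assumption and not the authors' implementation, which is available in the linked repository.

```python
# Sketch of the "sandwich" input ordering from the abstract: terrain image
# at the start of the preceding stride, the preceding stride's IMU series,
# then the terrain image at the start of the ensuing stride. All sizes and
# module choices are assumptions for illustration only.
import torch
import torch.nn as nn

class SandwichFusionSketch(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=2,
                 img_size=224, imu_channels=6, imu_patch=16, out_len=100):
        super().__init__()
        # Width-level patchify: one token per full-height vertical strip
        # (see the following sketch for the token-count comparison).
        self.img_embed = nn.Conv2d(3, d_model,
                                   kernel_size=(img_size, 16), stride=(img_size, 16))
        # Patch the IMU series into non-overlapping windows, one token each.
        self.imu_embed = nn.Conv1d(imu_channels, d_model,
                                   kernel_size=imu_patch, stride=imu_patch)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, out_len)  # thigh-angle samples, ensuing stride

    def forward(self, img_prev, imu_prev, img_next):
        tok_prev = self.img_embed(img_prev).flatten(2).transpose(1, 2)  # (B, W/16, d)
        tok_next = self.img_embed(img_next).flatten(2).transpose(1, 2)
        tok_imu = self.imu_embed(imu_prev).transpose(1, 2)              # (B, T/16, d)
        tokens = torch.cat([tok_prev, tok_imu, tok_next], dim=1)        # sandwich order
        fused = self.encoder(tokens)                                    # joint attention
        return self.head(fused.mean(dim=1))                             # (B, out_len)

model = SandwichFusionSketch()
img_prev = torch.randn(2, 3, 224, 224)   # terrain at start of preceding stride
img_next = torch.randn(2, 3, 224, 224)   # terrain at start of ensuing stride
imu_prev = torch.randn(2, 6, 160)        # 6-axis IMU over the preceding stride
print(model(img_prev, imu_prev, img_next).shape)  # torch.Size([2, 100])
```

One way to read the reported gain from the sandwich input is that joint attention over both images lets the kinematic tokens condition simultaneously on the terrain being left and the terrain being entered, which is exactly the information a transition stride needs.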
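The abstract attributes the low computational cost (3.31 GFLOPs) partly to width-level patchify. A plausible reading, sketched below under the assumption that each token covers a full-height vertical strip rather than a square ViT patch (strip width 16 is an assumption; only "patchify along width" comes from the abstract), is that this sharply reduces the number of image tokens entering the transformer:

```python
# Token counts: standard ViT 16x16 patchify vs. a width-level patchify that
# uses full-height (H x 16) vertical strips. Strip width 16 is assumed.
import torch
import torch.nn as nn

H = W = 224
vit_patchify = nn.Conv2d(3, 128, kernel_size=16, stride=16)               # 16x16 squares
width_patchify = nn.Conv2d(3, 128, kernel_size=(H, 16), stride=(H, 16))   # H x 16 strips

x = torch.randn(1, 3, H, W)
print(vit_patchify(x).flatten(2).shape[-1])    # 196 tokens per image
print(width_patchify(x).flatten(2).shape[-1])  # 14 tokens per image
```

Since self-attention cost grows quadratically with sequence length, cutting each terrain image from 196 to 14 tokens keeps the fused image-IMU-image sequence short, which is consistent with the reported efficiency. For egocentric walking views, vertical strips also align with how terrain depth varies mainly along the image height within each strip.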
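The abstract reports accuracy as RMSE (3.445 ± 0.804°) and PCC (0.971 ± 0.025) between the forecast and measured thigh-angle trajectories. A sketch of the standard definitions of those two metrics is below; computing them per stride and aggregating as mean ± std across subjects is an assumption about the reporting convention, not something the abstract states.

```python
# RMSE and Pearson's correlation coefficient between a forecast thigh-angle
# trajectory and the measured one, on a toy stride.
import numpy as np

def rmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def pcc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    return float(np.corrcoef(y_true, y_pred)[0, 1])

t = np.linspace(0.0, 1.0, 100)
measured = 30.0 * np.sin(2 * np.pi * t)              # toy thigh angle, degrees
forecast = measured + np.random.normal(0.0, 3.0, t.shape)
print(f"RMSE = {rmse(measured, forecast):.3f} deg, PCC = {pcc(measured, forecast):.3f}")
```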