Deep Learning based acoustic measurement approach for robotic applications on orthopedics (2403.05879v1)
Abstract: In Total Knee Arthroplasty (TKA), surgical robots provide image-guided navigation to fit implants with high precision. Their tracking approach relies heavily on bone pins inserted into the bones and tracked by an optical tracking system. This registration is normally performed with invasive, radiation-based procedures (implantable markers and CT scans), which introduce unnecessary trauma and prolong patient preparation time. Ultrasound-based bone tracking offers an alternative. In this study, we propose a novel deep learning architecture, CasAtt-UNet, to improve the accuracy of bone tracking with A-mode ultrasound (US). We first acquired an ultrasound dataset from a cadaver experiment in which the ground-truth bone locations were computed from bone pins. Because the ground-truth bone locations and the US signals were recorded simultaneously, we could label the bone peaks in the raw US signals and use these labeled data to train CasAtt-UNet to predict bone locations automatically and robustly. Our method achieved sub-millimeter precision across all eight bone areas, with the sole exception of one channel at the ankle. It enables robust measurement of lower-extremity bone positions from raw 1D ultrasound signals and shows great potential for applying A-mode ultrasound in orthopedic surgery as a safe, convenient, and efficient approach.
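The abstract does not specify the CasAtt-UNet architecture, so the following is only a minimal sketch of the general idea it names: a 1D attention-gated U-Net that maps a raw single-channel A-mode signal to a per-sample bone-peak probability. The depth, channel widths, input length, and output head are assumptions for illustration, not the authors' implementation.

```python
# Sketch of a 1D attention-gated U-Net for bone-peak localization in A-mode
# ultrasound. Architecture details (depth, widths, 1024-sample input) are
# assumed; they are not taken from the paper.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 1D convolutions with batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm1d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm1d(out_ch),
        nn.ReLU(inplace=True),
    )


class AttentionGate1d(nn.Module):
    """Additive attention gate in the style of Attention U-Net, adapted to 1D."""

    def __init__(self, gate_ch, skip_ch, inter_ch):
        super().__init__()
        self.w_g = nn.Conv1d(gate_ch, inter_ch, kernel_size=1)
        self.w_x = nn.Conv1d(skip_ch, inter_ch, kernel_size=1)
        self.psi = nn.Sequential(nn.Conv1d(inter_ch, 1, kernel_size=1), nn.Sigmoid())

    def forward(self, gate, skip):
        att = self.psi(torch.relu(self.w_g(gate) + self.w_x(skip)))
        return skip * att  # down-weight skip features from irrelevant echoes


class AttentionUNet1d(nn.Module):
    """Two-level encoder-decoder producing a per-sample bone-peak probability."""

    def __init__(self, base_ch=16):
        super().__init__()
        self.enc1 = conv_block(1, base_ch)
        self.enc2 = conv_block(base_ch, base_ch * 2)
        self.pool = nn.MaxPool1d(2)
        self.bottleneck = conv_block(base_ch * 2, base_ch * 4)
        self.up2 = nn.ConvTranspose1d(base_ch * 4, base_ch * 2, kernel_size=2, stride=2)
        self.att2 = AttentionGate1d(base_ch * 2, base_ch * 2, base_ch)
        self.dec2 = conv_block(base_ch * 4, base_ch * 2)
        self.up1 = nn.ConvTranspose1d(base_ch * 2, base_ch, kernel_size=2, stride=2)
        self.att1 = AttentionGate1d(base_ch, base_ch, base_ch // 2)
        self.dec1 = conv_block(base_ch * 2, base_ch)
        self.head = nn.Conv1d(base_ch, 1, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.up2(b)
        d2 = self.dec2(torch.cat([self.att2(d2, e2), d2], dim=1))
        d1 = self.up1(d2)
        d1 = self.dec1(torch.cat([self.att1(d1, e1), d1], dim=1))
        return torch.sigmoid(self.head(d1))  # probability of a bone interface per sample


if __name__ == "__main__":
    # Batch of 4 single-channel A-mode signals, 1024 samples each (assumed length).
    signals = torch.randn(4, 1, 1024)
    probs = AttentionUNet1d()(signals)
    peak_index = probs.argmax(dim=-1)  # depth index of the predicted bone echo
    print(probs.shape, peak_index.shape)
```

The argmax over the output converts the per-sample probabilities into a single depth index per channel, which is the kind of bone-location estimate the paper reports at sub-millimeter precision; how the actual method extracts the peak is not stated in the abstract.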