Sensory Glove-Based Surgical Robot User Interface (2403.13941v2)
Abstract: Robotic surgery has reached a high level of maturity and has become an integral part of standard surgical care. However, existing surgeon consoles are bulky, take up valuable space in the operating room, make surgical team coordination challenging, and their proprietary nature makes it difficult to take advantage of recent technological advances, especially in virtual and augmented reality. One potential area for further improvement is the integration of modern sensory gloves into robotic platforms, allowing surgeons to control robotic arms intuitively with their hand movements. We propose one such system that combines an HTC Vive tracker, a Manus Meta Prime 3 XR sensory glove, and SCOPEYE wireless smart glasses. The system controls one arm of a da Vinci surgical robot. In addition to moving the arm, the surgeon can use their fingers to control the end-effector of the surgical instrument. Hand gestures implement clutching and similar functions; in particular, we introduce clutching of the instrument orientation, a functionality unavailable in the da Vinci system. The vibrotactile elements of the glove provide feedback to the user when gesture commands are invoked. A qualitative and quantitative evaluation comparing the proposed device with the dVRK console shows that the system has excellent tracking accuracy and that the new interface allows surgeons to perform common surgical training tasks efficiently with minimal practice.
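To make the clutching behavior described in the abstract concrete, the sketch below shows one plausible way to decouple hand motion from instrument motion, with separate position and orientation clutches and a finger-driven jaw command. This is a minimal illustration under assumed conventions, not the authors' implementation: the class name `ClutchedTeleop`, the constant `MOTION_SCALE`, the jaw-angle range, and the stubbed sensor inputs are all hypothetical, and the pose math simply uses NumPy and SciPy rotations.

```python
# Hypothetical sketch of glove-based teleoperation with separate position and
# orientation clutching (the orientation clutch is the capability the abstract
# highlights as unavailable in the da Vinci system). Robot and glove I/O are
# stubbed; only the command-generation logic is shown.

import numpy as np
from scipy.spatial.transform import Rotation as R

MOTION_SCALE = 0.4  # hand-to-instrument motion scaling (assumed value)


class ClutchedTeleop:
    def __init__(self, p_init, R_init):
        self.p_cmd = np.asarray(p_init, dtype=float)  # commanded tool position (m)
        self.R_cmd = R_init                           # commanded tool orientation
        self.prev_hand = None                         # last (p_hand, R_hand) sample

    def update(self, p_hand, R_hand, jaw_open, pos_clutch, ori_clutch):
        """Advance one control cycle.

        p_hand, R_hand : hand pose from the wrist-mounted tracker (world frame)
        jaw_open       : normalized finger opening in [0, 1] from the glove
        pos_clutch     : True while the position-clutch gesture is held
        ori_clutch     : True while the orientation-clutch gesture is held
        """
        if self.prev_hand is not None:
            p_prev, R_prev = self.prev_hand
            if not pos_clutch:
                # Incremental, scaled position command; ignored while clutched.
                self.p_cmd += MOTION_SCALE * (p_hand - p_prev)
            if not ori_clutch:
                # Apply the incremental hand rotation to the tool orientation;
                # while the orientation clutch is held, the tool keeps its pose.
                self.R_cmd = (R_hand * R_prev.inv()) * self.R_cmd
        self.prev_hand = (p_hand, R_hand)

        # Map finger opening to a jaw angle (range is an assumption).
        jaw_cmd = np.clip(jaw_open, 0.0, 1.0) * np.deg2rad(60.0)
        return self.p_cmd, self.R_cmd, jaw_cmd


# Example cycle with synthetic sensor values: the hand translates while the
# orientation clutch is held, so only the position command would change.
teleop = ClutchedTeleop(p_init=[0.0, 0.0, -0.10], R_init=R.identity())
teleop.update(np.zeros(3), R.identity(), jaw_open=0.5,
              pos_clutch=False, ori_clutch=True)
p_cmd, R_cmd, jaw_cmd = teleop.update(np.array([0.01, 0.0, 0.0]), R.identity(),
                                      jaw_open=0.5, pos_clutch=False,
                                      ori_clutch=True)
```

In an actual system the returned commands would be sent to the robot controller (e.g., a dVRK Cartesian servo interface) and the clutch flags would come from the glove's gesture recognizer, with vibrotactile pulses confirming gesture activation as described in the abstract.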