Robust Surgical Tool Tracking with Pixel-based Probabilities for Projected Geometric Primitives (2403.04971v1)

Published 8 Mar 2024 in cs.RO and cs.CV

Abstract: Controlling robotic manipulators via visual feedback requires a known coordinate frame transformation between the robot and the camera. Uncertainties in mechanical systems as well as camera calibration create errors in this coordinate frame transformation. These errors result in poor localization of robotic manipulators and create a significant challenge for applications that rely on precise interactions between manipulators and the environment. In this work, we estimate the camera-to-base transform and joint angle measurement errors for surgical robotic tools using an image-based insertion-shaft detection algorithm and probabilistic models. We apply our proposed approach in both a structured and an unstructured environment to demonstrate the efficacy of our methods.
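To make the abstract's central problem concrete, the sketch below shows why errors in the camera-to-base transform degrade tool localization: a small translation offset in the transform shifts the projected pixel location of a tool point. This is a minimal illustration, not the paper's method; the pinhole intrinsics, transforms, and tool point are all hypothetical values chosen for clarity.

```python
import numpy as np

def to_homogeneous(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def project(K, T_cam_base, p_base):
    """Project a 3D point given in the robot base frame to pixel coordinates."""
    p_cam = (T_cam_base @ np.append(p_base, 1.0))[:3]  # base frame -> camera frame
    uvw = K @ p_cam                                    # camera frame -> image plane
    return uvw[:2] / uvw[2]                            # perspective division

# Illustrative pinhole intrinsics (focal length 800 px, principal point 320x240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Ground-truth camera-to-base transform: camera 0.5 m in front of the base.
T_true = to_homogeneous(np.eye(3), np.array([0.0, 0.0, 0.5]))

# Same transform corrupted by a 5 mm translation error along the camera x-axis.
T_err = to_homogeneous(np.eye(3), np.array([0.005, 0.0, 0.5]))

p_tool = np.array([0.0, 0.0, 0.1])  # tool point 10 cm along the base z-axis

uv_true = project(K, T_true, p_tool)
uv_err = project(K, T_err, p_tool)
pixel_error = np.linalg.norm(uv_true - uv_err)
print(pixel_error)  # a 5 mm transform error already shifts the tool by several pixels
```

At this camera depth the 5 mm offset moves the projected tool point by roughly 6.7 pixels, which is why the paper treats estimating and correcting the camera-to-base transform (and joint angle errors) as essential for precise tool interaction.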
