
Autonomous Robotic Drilling System for Mice Cranial Window Creation: An Evaluation with an Egg Model (2303.12265v2)

Published 22 Mar 2023 in cs.RO and cs.AI

Abstract: Robotic assistance for experimental manipulation in the life sciences is expected to enable precise manipulation of valuable samples, regardless of the skill of the scientist. Experimental specimens in the life sciences are subject to individual variability and deformation, and therefore require autonomous robotic control. As an example, we are studying the installation of a cranial window in a mouse. This operation requires cutting the skull, which is approximately 300 µm thick, into a circular shape 8 mm in diameter and removing it, but the shape of the mouse skull varies with mouse strain, sex, and age in weeks. The thickness of the skull is not uniform: some areas are thin and others thicker. It is also difficult to keep the mouse skull in the same position for each operation. It is not realistically possible to measure all of these features and pre-program a robotic trajectory for each individual mouse. This paper therefore proposes an autonomous robotic drilling method. The proposed method consists of drilling trajectory planning and image-based task completion level recognition. The trajectory planning adjusts the z-position of the drill according to the task completion level at each discrete point, and forms the 3D drilling path via constrained cubic spline interpolation while avoiding overshoot. The task completion level recognition uses a DSSD-inspired deep learning model to estimate the task completion level at each discrete point. Since an egg has characteristics similar to a mouse skull in terms of shape, thickness, and mechanical properties, removing the eggshell without damaging the membrane underneath was chosen as the simulation task. The proposed method was evaluated using a 6-DOF robotic arm holding a drill and achieved a success rate of 80% over 20 trials.
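The abstract's trajectory-planning idea can be sketched in a few lines: lower each discrete point's z-depth according to its estimated task completion level, then pass the resulting knots through a constrained cubic spline in the style of Kruger (2003), whose clamped slopes avoid the overshoot of a natural spline. Everything below — the function names, the linear depth-update rule, and the step size — is an illustrative assumption, not the authors' implementation.

```python
def advance_depths(z, completion, step=0.02):
    """Hypothetical per-point depth update: drill deeper where the
    completion level (0..1) is low, hold position where it is 1."""
    return [zi - step * (1.0 - c) for zi, c in zip(z, completion)]

def kruger_slopes(xs, ys):
    """Constrained-spline slopes (Kruger 2003): zero where adjacent
    secant slopes change sign, harmonic mean otherwise."""
    n = len(xs)
    d = [(ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i]) for i in range(n - 1)]
    s = [0.0] * n
    for i in range(1, n - 1):
        if d[i - 1] * d[i] <= 0.0:
            s[i] = 0.0  # local extremum or flat: clamp to avoid overshoot
        else:
            s[i] = 2.0 / (1.0 / d[i - 1] + 1.0 / d[i])
    # Endpoint slopes as suggested by Kruger
    s[0] = 1.5 * d[0] - 0.5 * s[1]
    s[-1] = 1.5 * d[-1] - 0.5 * s[-2]
    return s

def hermite_eval(xs, ys, s, x):
    """Evaluate the piecewise cubic Hermite interpolant at x."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            h = xs[i + 1] - xs[i]
            t = (x - xs[i]) / h
            h00 = 2 * t**3 - 3 * t**2 + 1
            h10 = t**3 - 2 * t**2 + t
            h01 = -2 * t**3 + 3 * t**2
            h11 = t**3 - t**2
            return h00 * ys[i] + h10 * h * s[i] + h01 * ys[i + 1] + h11 * h * s[i + 1]
    raise ValueError("x outside the knot range")
```

With step-like knot data such as `ys = [0, 0, 1, 1]`, the clamped slopes keep the interpolated path within the knot range, which is the overshoot-avoidance property the abstract relies on when the path dips toward thin skull regions.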

