Using Fiber Optic Bundles to Miniaturize Vision-Based Tactile Sensors (2403.05500v4)

Published 8 Mar 2024 in cs.RO

Abstract: Vision-based tactile sensors have recently become popular due to their combination of low cost, very high spatial resolution, and ease of integration using widely available miniature cameras. The associated field of view and focal length, however, are difficult to package in a human-sized finger. In this paper we employ optical fiber bundles to achieve a form factor that, at 15 mm diameter, is smaller than an average human fingertip. The electronics and camera are also located remotely, further reducing package size. The sensor achieves a spatial resolution of 0.22 mm and a minimum force resolution of 5 mN for normal and shear contact forces. With these attributes, the DIGIT Pinki sensor is suitable for applications such as robotic and teleoperated digital palpation. We demonstrate its utility for palpation of the prostate gland and show that it can achieve clinically relevant discrimination of prostate stiffness for phantom and ex vivo tissue.

Summary

  • The paper introduces a novel miniaturization approach that integrates fiber optic bundles with vision-based tactile sensors to overcome size limitations.
  • It details the design, fabrication, and testing of a compact sensor capable of high-resolution tactile imaging and precise force detection.
  • The sensor's performance in differentiating materials of varying stiffness and its successful application in simulated tissue palpation underscore its potential for advanced medical robotics.

Miniaturization of Vision-Based Tactile Sensors through Fiber Optic Bundles

Introduction to Tactile Sensing Innovations

As artificial intelligence and robotics continue to intertwine with medical applications, the demand for precise and versatile tactile sensors grows. In medical robotics, these sensors play a pivotal role in enhancing the dexterity of robotic systems, especially in sensitive tasks such as tissue palpation. A multidisciplinary team spanning institutions including Stanford University and Meta introduces a novel approach to tactile sensing, leveraging fiber optic bundles to significantly reduce the size of vision-based tactile sensors. The paper, published in IEEE Transactions on Robotics, details the design, fabrication, and evaluation of the DIGIT Pinki sensor, targeting applications in medical robotics and beyond.

Design and Fabrication

The paper begins with a survey of the current landscape of tactile sensing, identifying the size and flexibility limitations of existing technologies. The authors propose a design that packages a vision-based sensing system within a significantly reduced footprint, using fiber optic bundles to relay the tactile image from the fingertip to a remotely located miniature camera. The design comprises three main components:

  • Sensing Element: Utilizes a deformable surface to capture tactile imprints, which are then conveyed via fiber optic bundles.
  • Imaging System: A compact assembly that includes a camera module capable of receiving the tactile information transmitted through the fiber optic bundles.
  • Illumination System: Ensures consistent lighting for the transmitted images to accurately represent the tactile interaction.

The fabrication process is meticulously detailed, offering insights into the challenges and solutions encountered when miniaturizing components without sacrificing the sensor's fidelity and functionality.
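
To make the imaging pipeline concrete, the sketch below shows one plausible way a remote-camera tactile sensor of this kind could be read out in software: raw frames arrive through the fiber bundle, the honeycomb pattern imposed by the individual fiber cores is suppressed with a low-pass filter, and the result is differenced against a no-contact reference frame. This is a minimal illustration under assumptions of ours (OpenCV capture, a fixed Gaussian kernel), not the authors' implementation.

```python
import cv2
import numpy as np

# Hypothetical readout sketch for a fiber-bundle-coupled tactile sensor.
# Assumptions of this sketch (not from the paper): OpenCV camera capture,
# a Gaussian low-pass filter to suppress the fiber-core honeycomb pattern,
# and a simple difference against a no-contact reference frame.

KERNEL_SIZE = (7, 7)  # ad hoc; in practice tuned to the fiber core pitch


def despeckle(frame: np.ndarray) -> np.ndarray:
    """Suppress the honeycomb structure imposed by individual fiber cores."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, KERNEL_SIZE, 0)


def contact_map(frame: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Per-pixel deformation signal relative to a no-contact reference frame."""
    return cv2.absdiff(despeckle(frame), reference)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # camera at the proximal end of the bundle
    ok, ref = cap.read()
    assert ok, "failed to read a reference frame"
    reference = despeckle(ref)  # captured with nothing touching the gel

    ok, frame = cap.read()
    if ok:
        deformation = contact_map(frame, reference)
        print("mean deformation signal:", float(deformation.mean()))
    cap.release()
```

Fiber-bundle imaging systems often use more careful core-interpolation or calibration schemes; the Gaussian blur here is simply the most basic stand-in.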

Sensor Characterization

A comprehensive characterization of the sensor's capabilities covers resolution, sensitivity, and range. Benchmark tests show a spatial resolution of 0.22 mm and a minimum force resolution of 5 mN for both normal and shear contact forces, supporting its potential utility in precision-demanding applications such as robotic surgery. The sensor's ability to resolve fine contact geometry and small forces provides a strong foundation for sophisticated manipulation tasks.
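
Vision-based tactile sensors are commonly calibrated for force by collecting deformation images alongside readings from a reference force/torque sensor and fitting a mapping between the two. The snippet below sketches such a calibration using a plain least-squares fit from aggregate image features to normal force; the feature choice and model are our own illustrative assumptions, not the characterization procedure reported in the paper.

```python
import numpy as np

# Hypothetical force calibration: regress normal force (measured by a
# reference force/torque sensor) onto simple aggregate features of the
# deformation image. Features and model are illustrative assumptions only.


def features(deformation: np.ndarray) -> np.ndarray:
    """Reduce a deformation image to a small feature vector (with bias term)."""
    d = deformation.astype(np.float64)
    return np.array([d.sum(), d.max(), float((d > 10).sum()), 1.0])


def fit_calibration(deformation_images, measured_forces_newtons):
    """Least-squares map from image features to force in newtons."""
    X = np.stack([features(d) for d in deformation_images])
    y = np.asarray(measured_forces_newtons, dtype=np.float64)
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs


def predict_force(deformation: np.ndarray, coeffs: np.ndarray) -> float:
    return float(features(deformation) @ coeffs)
```

Given such a mapping, the reported 5 mN resolution corresponds to reliably distinguishing the image changes produced by force differences of roughly that magnitude.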

Practical Applications

The paper validates the sensor's practicality through a series of palpation experiments. Initially, silicone samples of varying hardness are used to simulate tissue consistency, demonstrating the sensor's efficacy in distinguishing subtle differences in material stiffness. Subsequently, a case study involving prostate palpation showcases the sensor's potential in medical diagnostics, achieving clinically relevant discrimination of prostate stiffness on phantom and ex vivo tissue.
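
As a toy illustration of this kind of discrimination task, the sketch below classifies presses as "soft" or "stiff" with a nearest-centroid rule over aggregate deformation features, relying on the fact that, at a comparable applied force, stiffer samples tend to produce smaller and more concentrated imprints in the gel. The features, classifier, and labels are hypothetical choices of ours; the paper's actual evaluation pipeline is not reproduced here.

```python
import numpy as np

# Toy stiffness discrimination: given deformation images from presses on
# samples of known stiffness, classify a new press by nearest centroid.
# Purely illustrative; not the paper's evaluation pipeline.


def press_features(deformation: np.ndarray) -> np.ndarray:
    d = deformation.astype(np.float64)
    contact_area = float((d > 10).sum())  # stiffer samples: smaller imprint
    return np.array([contact_area, float(d.max()), float(d.sum())])


def fit_centroids(images_by_label):
    """Average feature vector per label, e.g. {'soft': [...], 'stiff': [...]}."""
    return {
        label: np.mean([press_features(img) for img in images], axis=0)
        for label, images in images_by_label.items()
    }


def classify(deformation: np.ndarray, centroids) -> str:
    f = press_features(deformation)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))
```

A learned model over full image or video sequences would likely perform better, but the underlying signal is the same: for a given force, stiffer tissue deforms the sensing surface differently than softer tissue.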

Theoretical and Future Implications

While the immediate implications of this research center on advances in medical robotics, the theoretical contributions lay a foundation for future explorations in miniaturized tactile sensing. The use of fiber optic bundles to relay tactile information opens avenues for further reducing sensor size, enhancing flexibility, and integrating sensors into a broader range of medical instruments and robotic systems.

Conclusion

This research presents a significant step towards overcoming the size limitations inherent in current vision-based tactile sensors. The miniaturization achieved through the innovative use of fiber optics not only broadens the scope of applications in medical robotics but also sets a precedent for future developments in tactile sensing technology. As the field advances, the integration of such miniaturized sensors in robotic systems promises to enhance their functionality, making them more adept at performing complex tasks with precision and sensitivity akin to human touch.
