
TEXterity: Tactile Extrinsic deXterity (2401.10230v2)

Published 18 Jan 2024 in cs.RO

Abstract: We introduce a novel approach that combines tactile estimation and control for in-hand object manipulation. By integrating measurements from robot kinematics and an image-based tactile sensor, our framework estimates and tracks object pose while simultaneously generating motion plans to control the pose of a grasped object. This approach consists of a discrete pose estimator that uses the Viterbi decoding algorithm to find the most likely sequence of object poses in a coarsely discretized grid, and a continuous pose estimator-controller to refine the pose estimate and accurately manipulate the pose of the grasped object. Our method is tested on diverse objects and configurations, achieving desired manipulation objectives and outperforming single-shot methods in estimation accuracy. The proposed approach holds potential for tasks requiring precise manipulation in scenarios where visual perception is limited, laying the foundation for closed-loop behavior applications such as assembly and tool use. Please see supplementary videos for real-world demonstration at https://sites.google.com/view/texterity.

Authors (4)
  1. Antonia Bronars (8 papers)
  2. Sangwoon Kim (7 papers)
  3. Parag Patre (3 papers)
  4. Alberto Rodriguez (79 papers)

Summary

  • The paper introduces a novel framework that fuses tactile sensing with robot kinematics for precise in-hand object manipulation.
  • It employs a discrete Viterbi estimator refined by a continuous controller to significantly reduce pose estimation errors.
  • Experimental results on sub-millimeter insertion tasks demonstrate enhanced precision and robustness in real-world applications.

An Expert Analysis of "TEXterity: Tactile Extrinsic deXterity"

The paper "TEXterity: Tactile Extrinsic deXterity" presents a comprehensive approach to enhancing the dexterity of robotic systems through tactile feedback integration, focusing on the intricate task of in-hand object manipulation. This work stands out by developing a framework that combines measurements from image-based tactile sensors and robot kinematics to estimate and control the pose of objects held within a robotic gripper. The goal is to achieve precise manipulation, especially in conditions where visual input is limited, thereby advancing the capabilities required for tasks such as assembly and tool handling.

Core Contributions

The approach detailed in the paper consists of two main components: a discrete pose estimator based on Viterbi decoding and a continuous pose estimator-controller. The discrete estimator finds the most likely sequence of object poses on a coarse grid, which the continuous estimator-controller then refines for accurate manipulation. The fusion of tactile sensing with proprioceptive feedback differentiates the proposed method from conventional single-shot estimation techniques, improving the accuracy of object pose estimation under occluded conditions.
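The Viterbi decoding step at the heart of the discrete estimator can be sketched as follows. This is a generic dynamic-programming implementation over a discretized pose grid, not the paper's code; the `log_emission` and `log_transition` tables are hypothetical stand-ins for the paper's tactile-likelihood and motion models.

```python
def viterbi_decode(num_poses, log_emission, log_transition):
    """Return the most likely sequence of discrete pose indices.

    log_emission:   T x N table -- log-likelihood of each tactile
                    observation under each candidate grid pose.
    log_transition: N x N table -- log-probability of moving between
                    grid cells in one step (encodes a motion prior).
    """
    T = len(log_emission)
    # Forward pass: best cumulative score to reach each pose at each step.
    score = [list(log_emission[0])]
    back = []
    for t in range(1, T):
        row, ptr = [], []
        for j in range(num_poses):
            best_i = max(range(num_poses),
                         key=lambda i: score[-1][i] + log_transition[i][j])
            ptr.append(best_i)
            row.append(score[-1][best_i] + log_transition[best_i][j]
                       + log_emission[t][j])
        score.append(row)
        back.append(ptr)
    # Backtrace from the best final pose to recover the full sequence.
    path = [max(range(num_poses), key=lambda j: score[-1][j])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path
```

Because the whole observation history is decoded jointly, a single ambiguous tactile imprint can be overruled by the motion prior and neighboring observations, which is the mechanism the paper credits for filtering out single-shot ambiguity.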

Technical Insights

  1. Architecture and Methods: The paper proposes a structured combination of discrete and continuous evaluation models to achieve tactile-based state estimation. By leveraging a high-resolution tactile sensing interface, the system can infer object poses with higher precision compared to standalone methods. The Viterbi-based model filters out ambiguous pose predictions typically resulting from single tactile assessments, while the subsequent continuous refinement step fine-tunes these predictions, enhancing overall system responsiveness and reliability.
  2. Estimation Accuracy: Testing across various object types and configurations showcases the robustness of the proposed system. The paper demonstrates significant improvements in estimation accuracy, reducing normalized estimation errors consistently when transitioning from single-shot predictions to combined discrete and continuous methods. This structured approach allows the system to bridge the gap between high-resolution data capture and practical manipulation tasks, positioning it as a viable solution for precision-oriented applications.
  3. Implementation and Results: The experimental validation includes tests on multiple object profiles and an insertion task requiring sub-millimeter precision, demonstrating the feasibility of the approach in real-world tasks. The system achieves high success rates, particularly for objects with tapered profiles, underscoring the importance of tactile feedback in achieving the desired insertion and reorientation objectives.
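The continuous refinement stage fuses tactile and kinematic information into a single pose estimate. The paper formulates this as an optimization over the full measurement history; as a much-simplified, hypothetical sketch, the core idea of combining two noisy estimates of the same pose can be illustrated with per-coordinate inverse-variance weighting:

```python
def fuse_estimates(tactile_pose, kinematic_pose, var_tactile, var_kin):
    """Minimum-variance fusion of two independent noisy pose estimates.

    Each argument is a per-coordinate list (e.g. [x, y, theta]).
    Lower variance means higher trust; the fused value leans toward
    the more reliable sensor on each coordinate.
    """
    fused = []
    for zt, zk, vt, vk in zip(tactile_pose, kinematic_pose,
                              var_tactile, var_kin):
        w = vk / (vt + vk)          # weight on the tactile estimate
        fused.append(w * zt + (1 - w) * zk)
    return fused
```

In the actual system the relative trust in each modality varies with contact conditions rather than being fixed, but this weighting captures why adding proprioceptive kinematics to tactile sensing tightens the estimate beyond what either source gives alone.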

Implications and Future Directions

The practical implications of this research are multi-faceted. Within industries relying heavily on robotic automation, enhanced tactile feedback systems promise increased flexibility and reliability in environments where visual sensory data might be compromised. In scenarios requiring delicate manipulation, such as electronic assembly or surgical robotics, the ability to reorient and reposition tools with precision is invaluable.

Theoretically, the paper's findings open avenues for strengthening tactile feedback integration in robotic systems. Future research could explore automated strategies for determining optimal grasp and reorientation paths based on task requirements, further reducing the reliance on pre-specified configurations. Additionally, integrating this system with machine learning algorithms capable of adapting to variable conditions and new object geometries could elevate the adaptability and intelligence of robotic platforms.

Conclusion

This paper sets the stage for tactile-based manipulation strategies that enhance robotic dexterity by combining tactile sensing with kinematic feedback. Tactile extrinsic dexterity remains an evolving challenge, with further gains expected from refined control frameworks and sensing capabilities; fully autonomous, adaptable robotic manipulation is still some way off, but studies like this lay substantial groundwork towards that goal.
