
Rotating Objects via In-Hand Pivoting using Vision, Force and Touch (2303.10865v3)

Published 20 Mar 2023 in cs.RO

Abstract: We propose a robotic manipulation system that can pivot objects on a surface using vision, wrist force and tactile sensing. We aim to control the rotation of an object around the grip point of a parallel gripper by allowing rotational slip, while maintaining a desired wrist force profile. Our approach runs an end-effector position controller and a gripper width controller concurrently in a closed loop. The position controller maintains a desired force using vision and wrist force. The gripper controller uses tactile sensing to keep the grip firm enough to prevent translational slip, but loose enough to induce rotational slip. Our sensor-based control approach relies on matching a desired force profile derived from object dimensions and weight and vision-based monitoring of the object pose. The gripper controller uses tactile sensors to detect and prevent translational slip by tightening the grip when needed. Experimental results where the robot was tasked with rotating cuboid objects 90 degrees show that the multi-modal pivoting approach was able to rotate the objects without causing lift or slip, and was more energy-efficient compared to using a single sensor modality and to pick-and-place. While our work demonstrated the benefit of multi-modal sensing for the pivoting task, further work is needed to generalize our approach to any given object.
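The two concurrent control loops described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: all function names, gains, thresholds, and sensor interfaces are hypothetical placeholders.

```python
# Hypothetical sketch of the dual-loop pivoting controller from the abstract:
# one loop moves the end-effector to track a desired wrist-force profile,
# while a second loop adjusts gripper width so the grip stays loose enough
# to allow rotational slip but tightens when tactile sensing detects
# translational slip. All interfaces and constants are invented for clarity.

def pivot_step(state, desired_force, kp=0.01, width_step=0.0005,
               trans_slip_thresh=0.002):
    """One iteration of the two concurrent control loops.

    state: dict with keys
        'wrist_force' -- measured wrist force (N)
        'grip_width'  -- current gripper opening (m)
        'trans_slip'  -- tactile-estimated translational slip (m)
        'ee_z'        -- current end-effector height (m)
    Returns (ee_z_cmd, grip_width_cmd).
    """
    # Loop 1: end-effector position controller (vision + wrist force).
    # A proportional, admittance-style correction drives the measured
    # wrist force toward the desired profile.
    force_error = desired_force - state['wrist_force']
    ee_z_cmd = state['ee_z'] - kp * force_error

    # Loop 2: gripper width controller (tactile sensing).
    # Tighten only when translational slip is detected; otherwise keep
    # the grip loose so the object can pivot around the grip point.
    if state['trans_slip'] > trans_slip_thresh:
        grip_width_cmd = state['grip_width'] - width_step  # tighten
    else:
        grip_width_cmd = state['grip_width']               # stay loose

    return ee_z_cmd, grip_width_cmd
```

In the paper's formulation the desired force profile is derived from the object's dimensions and weight; here it is simply passed in as a scalar setpoint.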

Authors (5)
  1. Shiyu Xu (9 papers)
  2. Tianyuan Liu (9 papers)
  3. Michael Wong (6 papers)
  4. Akansel Cosgun (59 papers)
  5. Dana Kulić (38 papers)
Citations (1)
