AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch (2405.07391v3)
Abstract: Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and presenting a sim-to-real method for rich tactile sensing that achieves zero-shot policy transfer. Our formulation allows a unified policy to be trained to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects with varying properties. Interestingly, we found that rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy. The project website can be found at https://maxyang27896.github.io/anyrotate/.
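To make the "unified policy" idea concrete, here is a minimal sketch of a goal-conditioned network that consumes dense per-fingertip tactile features together with proprioception and a desired rotation axis. Everything in it (the name `TactileRotationPolicy`, the feature dimensions, the per-fingertip encoder, and the action head) is an illustrative assumption, not the authors' architecture; the paper trains its dense tactile policy with reinforcement learning in simulation and transfers it zero-shot to the real hand.

```python
# A minimal sketch (not the authors' code) of a policy conditioned on dense
# tactile features and an arbitrary rotation-axis goal. All shapes and the
# network layout are assumptions for illustration only.
import torch
import torch.nn as nn

class TactileRotationPolicy(nn.Module):
    def __init__(self, num_fingers=4, tactile_dim=16, proprio_dim=32, act_dim=16):
        super().__init__()
        # Per-fingertip encoder for dense tactile features (e.g. contact pose
        # and contact force estimates) -- dimensions are assumed.
        self.tactile_encoder = nn.Sequential(
            nn.Linear(tactile_dim, 64), nn.ReLU(), nn.Linear(64, 32),
        )
        # Trunk fuses tactile embeddings, joint state, and the goal rotation
        # axis (a unit 3-vector), so one policy can cover arbitrary axes and
        # hand orientations.
        trunk_in = num_fingers * 32 + proprio_dim + 3
        self.trunk = nn.Sequential(
            nn.Linear(trunk_in, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, act_dim), nn.Tanh(),  # normalized joint-target deltas
        )

    def forward(self, tactile, proprio, rot_axis):
        # tactile: (B, num_fingers, tactile_dim); proprio: (B, proprio_dim);
        # rot_axis: (B, 3) unit vector giving the desired rotation axis.
        z = self.tactile_encoder(tactile).flatten(start_dim=1)
        return self.trunk(torch.cat([z, proprio, rot_axis], dim=-1))

# One forward pass with dummy data: rotate about the hand-frame z-axis.
policy = TactileRotationPolicy()
act = policy(torch.randn(1, 4, 16), torch.randn(1, 32),
             torch.tensor([[0.0, 0.0, 1.0]]))
print(act.shape)  # torch.Size([1, 16])
```

Feeding the rotation axis in as an observation, rather than fixing it in the reward, is what would let a single network rotate objects about any axis in any hand direction, mirroring the abstract's claim.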
- Max Yang
- Chenghua Lu
- Alex Church
- Yijiong Lin
- Chris Ford
- Haoran Li
- Efi Psomopoulou
- David A. W. Barton
- Nathan F. Lepora