FunGrasp: Functional Grasping for Diverse Dexterous Hands (2411.16755v1)

Published 24 Nov 2024 in cs.RO and cs.CV

Abstract: Functional grasping is essential for humans to perform specific tasks, such as grasping scissors by the finger holes to cut materials or by the blade to safely hand them over. Enabling dexterous robot hands with functional grasping capabilities is crucial for their deployment to accomplish diverse real-world tasks. Recent research in dexterous grasping, however, often focuses on power grasps while overlooking task- and object-specific functional grasping poses. In this paper, we introduce FunGrasp, a system that enables functional dexterous grasping across various robot hands and performs one-shot transfer to unseen objects. Given a single RGBD image of functional human grasping, our system estimates the hand pose and transfers it to different robotic hands via a human-to-robot (H2R) grasp retargeting module. Guided by the retargeted grasping poses, a policy is trained through reinforcement learning in simulation for dynamic grasping control. To achieve robust sim-to-real transfer, we employ several techniques including privileged learning, system identification, domain randomization, and gravity compensation. In our experiments, we demonstrate that our system enables diverse functional grasping of unseen objects using single RGBD images, and can be successfully deployed across various dexterous robot hands. The significance of the components is validated through comprehensive ablation studies. Project page: https://hly-123.github.io/FunGrasp/ .

Summary

  • The paper introduces an H2R grasp retargeting method that maps human grasp poses to diverse robotic hand configurations while preserving functional contact points.
  • The paper employs reinforcement learning to achieve dynamic grasping, demonstrating a 74% success rate in handling unseen household objects.
  • The paper validates sim-to-real transfer through privileged learning and ablation studies, ensuring robust performance across different robotic hand models.

Insights into "FunGrasp: Functional Grasping for Diverse Dexterous Hands"

The paper "FunGrasp: Functional Grasping for Diverse Dexterous Hands" presents a comprehensive approach to advancing the capabilities of robotic hands by enabling task-specific functional grasping that can generalize across different types of robot hands and unseen objects. The authors propose a system that leverages single RGBD images of functional human grasping to generate corresponding retargeted grasps for robotic hands, coupled with dynamic control policies learned through reinforcement learning (RL).

Core Contributions

The paper introduces three primary components:

  1. H2R Grasp Retargeting: The system retargets human grasp poses to robot hands while preserving precise contact points and human-like postures. This is crucial given the morphological differences between human and robotic hands, such as differences in finger count and degrees of freedom (DoF); a small contact-preserving optimization sketch follows this list.
  2. Dynamic Dexterous Grasping: A reinforcement learning framework is employed to learn dynamic grasping motions. This component handles diverse object shapes through a tactile perception module, enabling one-shot generalization to unseen objects.
  3. Sim-to-Real Transfer: Techniques such as privileged learning, system identification, domain randomization, and gravity compensation are used to bridge the sim-to-real gap, ensuring the grasping policies can be deployed on real hardware; a minimal domain-randomization sketch also follows below.
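
The paper does not spell out its retargeting implementation here, but the idea of contact-preserving retargeting can be illustrated with a small optimization: given fingertip contact points extracted from the human grasp, solve for robot joint angles whose fingertips reach the same points while staying near a neutral posture. The sketch below is a minimal, assumption-laden toy (a planar hand with two 2-link fingers, made-up link lengths and contact points, optimized with scipy); it is not the paper's actual hand model or cost function.

```python
# Minimal sketch of contact-preserving grasp retargeting (illustrative only).
# A toy planar hand with two 2-link fingers is driven so that its fingertips
# reach contact points taken from a (hypothetical) estimated human grasp.
import numpy as np
from scipy.optimize import minimize

LINK_LENGTHS = np.array([0.04, 0.03])      # toy finger link lengths in meters (assumed)
FINGER_BASES = np.array([[0.00, 0.0],      # base position of finger 0 (assumed)
                         [0.05, 0.0]])     # base position of finger 1 (assumed)

def fingertip_position(base, joint_angles):
    """Forward kinematics of one planar 2-link finger."""
    a1, a2 = joint_angles
    tip = base + LINK_LENGTHS[0] * np.array([np.cos(a1), np.sin(a1)])
    tip = tip + LINK_LENGTHS[1] * np.array([np.cos(a1 + a2), np.sin(a1 + a2)])
    return tip

def retargeting_cost(q, human_contacts):
    """Squared distance between robot fingertips and human contact points,
    plus a small regularizer that keeps the posture close to neutral."""
    q = q.reshape(len(FINGER_BASES), 2)
    contact_err = sum(
        np.sum((fingertip_position(FINGER_BASES[i], q[i]) - human_contacts[i]) ** 2)
        for i in range(len(FINGER_BASES))
    )
    posture_reg = 1e-3 * np.sum(q ** 2)
    return contact_err + posture_reg

# Hypothetical fingertip contact points from a human grasp (meters).
human_contacts = np.array([[0.05, 0.05],
                           [0.08, 0.04]])

q0 = np.zeros(4)  # initial joint angles for both fingers
result = minimize(retargeting_cost, q0, args=(human_contacts,), method="L-BFGS-B")
print("retargeted joint angles:", result.x.round(3))
```

In the paper's setting the same idea operates in 3D with the full kinematics of each target robot hand, which is what lets one human demonstration be mapped to hands with different finger counts and DoF.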

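The sim-to-real techniques in item 3 can likewise be illustrated with a per-episode domain-randomization loop: before each training rollout, physics parameters are resampled so the learned policy does not overfit to one simulator configuration. The parameter names and ranges below are assumptions for illustration, not the paper's actual randomization scheme.

```python
# Minimal sketch of per-episode domain randomization (illustrative ranges only).
import random
from dataclasses import dataclass

@dataclass
class PhysicsParams:
    object_mass_scale: float    # multiplier on the nominal object mass
    friction: float             # contact friction coefficient
    joint_damping_scale: float  # multiplier on nominal joint damping
    action_latency_steps: int   # delayed actions, in control steps

def sample_physics_params(rng: random.Random) -> PhysicsParams:
    """Draw one set of randomized simulation parameters for a training episode."""
    return PhysicsParams(
        object_mass_scale=rng.uniform(0.7, 1.3),
        friction=rng.uniform(0.5, 1.5),
        joint_damping_scale=rng.uniform(0.8, 1.2),
        action_latency_steps=rng.randint(0, 2),
    )

rng = random.Random(0)
for episode in range(3):
    params = sample_physics_params(rng)
    # In a real pipeline these values would be written into the simulator
    # (object mass, friction, actuator properties) before each rollout.
    print(f"episode {episode}: {params}")
```
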
Experimental Evaluation

The system's evaluation encompasses both simulation and real-world tests. In the real-world experiments, the system demonstrates a 74% success rate in functional grasping across various unseen household objects, validating its generalization capabilities. The simulation results provide further quantitative insights, showing robust performance across different robotic hand models.

Key Numerical Results

  • The system demonstrates substantial success with a 74% functional grasping success rate on real-world objects unseen during training.
  • In simulation, the system achieves over 75% success rates across multiple robotic hand models, highlighting its adaptability to different hand morphologies.
  • Comprehensive ablation studies illustrate the effectiveness of each system component, particularly the significant improvements brought about by the H2R Grasp Retargeting module and privileged learning frameworks.

Implications

The research contributes significantly to the field of dexterous robot manipulation by addressing task-specific functional grasping, an area often overlooked in favor of power grasps. The ability to accurately mimic human-like grasping poses can propel robotic systems closer to effectively performing everyday tasks in human environments, enhancing their utility in areas such as healthcare and domestic assistance.

Future Prospects

The paper opens pathways for further research into integrating advanced sensing technologies and learning frameworks to enrich the grasping capabilities of robotic systems. Future work might focus on addressing the dependency on known object models for pose estimation, thus enhancing the system's versatility with objects not present in predefined datasets.

Overall, the paper provides a robust framework for advancing the field of robotic grasping, with a holistic approach that encompasses perception, planning, and control, all while ensuring that the system retains the human-like versatility that is essential for real-world applications.
