Sim-to-Real Transfer of Robotic Assembly with Visual Inputs Using CycleGAN and Force Control (2208.14104v1)
Abstract: Deep reinforcement learning (RL) has recently achieved impressive successes in robotic manipulation. However, training robots in the real world is nontrivial owing to sample-efficiency and safety concerns. Sim-to-real transfer addresses these concerns but introduces a new issue: the reality gap. In this work, we introduce a sim-to-real learning framework for vision-based assembly tasks that is trained entirely in a simulated environment using inputs from a single camera. To bridge the reality gap, we present a domain adaptation method based on cycle-consistent generative adversarial networks (CycleGAN) together with a force control transfer approach. We demonstrate that the proposed framework, trained only in simulation, transfers successfully to a real peg-in-hole setup.
- Chengjie Yuan
- Yunlei Shi
- Qian Feng
- Chunyang Chang
- Zhaopeng Chen
- Alois Christian Knoll
- Jianwei Zhang
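For readers unfamiliar with the cycle-consistency objective at the heart of CycleGAN, the sketch below illustrates how an L1 cycle loss couples two image translators, one mapping simulated camera frames toward the real domain and one mapping back. This is a minimal illustrative sketch, not the paper's implementation: the toy generator architecture and the names `G_sim2real`, `G_real2sim`, and `make_generator` are assumptions introduced here for demonstration.

```python
import torch
import torch.nn as nn

# Hypothetical toy generators (NOT the authors' networks): CycleGAN generators
# are typically deeper encoder-decoder or ResNet-based models.
def make_generator() -> nn.Sequential:
    """A tiny image-to-image generator mapping 3-channel images to 3-channel images."""
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(16, 3, kernel_size=3, padding=1),
        nn.Tanh(),  # outputs in [-1, 1], matching normalized image inputs
    )

G_sim2real = make_generator()  # translates simulated frames toward the real domain
G_real2sim = make_generator()  # translates real frames toward the simulated domain

l1 = nn.L1Loss()

def cycle_consistency_loss(sim_batch: torch.Tensor, real_batch: torch.Tensor) -> torch.Tensor:
    """L1 cycle loss: sim -> real -> sim and real -> sim -> real should each reconstruct the input."""
    sim_cycled = G_real2sim(G_sim2real(sim_batch))
    real_cycled = G_sim2real(G_real2sim(real_batch))
    return l1(sim_cycled, sim_batch) + l1(real_cycled, real_batch)

# Usage with dummy single-camera frames: a batch of 4 normalized 64x64 RGB images per domain.
sim = torch.rand(4, 3, 64, 64) * 2 - 1
real = torch.rand(4, 3, 64, 64) * 2 - 1
loss = cycle_consistency_loss(sim, real)
loss.backward()  # in full CycleGAN training, this term is combined with adversarial losses
```

In a complete CycleGAN, this cycle term is weighted and combined with two adversarial losses from domain discriminators. Note that the abstract pairs this visual domain adaptation with a force control transfer approach, which is not sketched here.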