Reconstructing Objects in-the-wild for Realistic Sensor Simulation (2311.05602v1)
Abstract: Reconstructing objects from real-world data and rendering them at novel views is critical to bringing realism, diversity, and scale to simulation for robotics training and testing. In this work, we present NeuSim, a novel approach that estimates accurate geometry and realistic appearance from sparse in-the-wild data captured at a distance and from limited viewpoints. Towards this goal, we represent the object surface as a neural signed distance function and leverage both LiDAR and camera sensor data to reconstruct smooth and accurate geometry and normals. We model the object appearance with a robust physics-inspired reflectance representation effective for in-the-wild data. Our experiments show that NeuSim achieves strong view-synthesis performance in challenging scenarios with sparse training views. Furthermore, we showcase composing NeuSim assets into a virtual world and generating realistic multi-sensor data for evaluating self-driving perception models.
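The abstract's core representation idea, modeling geometry as a signed distance function (SDF) from which surface normals follow as the normalized SDF gradient, can be illustrated with a toy analytic SDF. Here a sphere stands in for the paper's learned neural network; the function names are illustrative and not from the paper:

```python
import numpy as np

def sdf_sphere(p, center=np.zeros(3), radius=1.0):
    """Signed distance to a sphere: negative inside, zero on the surface, positive outside."""
    return np.linalg.norm(p - center) - radius

def sdf_normal(sdf, p, eps=1e-4):
    """Approximate the surface normal as the normalized finite-difference gradient of the SDF."""
    grad = np.array([
        (sdf(p + eps * e) - sdf(p - eps * e)) / (2 * eps)
        for e in np.eye(3)
    ])
    return grad / np.linalg.norm(grad)

# A point on the sphere's surface has distance ~0 and a normal pointing radially outward.
p = np.array([1.0, 0.0, 0.0])
print(round(sdf_sphere(p), 6))    # 0.0
print(sdf_normal(sdf_sphere, p))  # approximately [1, 0, 0]
```

In NeuSim the analytic sphere is replaced by an MLP trained on LiDAR and camera observations, but the same property holds: normals come for free as the gradient of the distance field, which is what enables the smooth geometry the abstract describes.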
- Ze Yang
- Sivabalan Manivasagam
- Yun Chen
- Jingkang Wang
- Rui Hu
- Raquel Urtasun