SynBody: Synthetic Dataset with Layered Human Models for 3D Human Perception and Modeling (2303.17368v2)
Abstract: Synthetic data has emerged as a promising source for 3D human research, as it offers low-cost access to large-scale human datasets. To advance the diversity and annotation quality of human models, we introduce a new synthetic dataset, SynBody, with three appealing features: 1) a clothed parametric human model that can generate a diverse range of subjects; 2) a layered human representation that naturally offers high-quality 3D annotations to support multiple tasks; 3) a scalable system for producing realistic data to facilitate real-world tasks. The dataset comprises 1.2M images with corresponding accurate 3D annotations, covering 10,000 human body models, 1,187 actions, and various viewpoints. It includes two subsets, one for human pose and shape estimation and one for human neural rendering. Extensive experiments on SynBody indicate that it substantially improves both SMPL and SMPL-X estimation. Furthermore, the layered annotations provide a valuable training resource for investigating human Neural Radiance Fields (NeRF).
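As a rough illustration of how layered per-frame annotations of this kind might be organized for downstream use, the sketch below defines a hypothetical record type and a subset filter. The names (`LayeredHumanRecord`, `select_subset`, `clothing_layers`, and the field shapes) are assumptions made here for illustration only and do not reflect the official SynBody data format or API.

```python
# Hypothetical sketch, not the SynBody release format: one way to group a
# parametric body layer (SMPL-X parameters), clothing layers, and camera/3D
# annotations for a single rendered frame.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

import numpy as np


@dataclass
class LayeredHumanRecord:
    """One rendered frame of a synthetic subject with layered annotations."""
    image_path: str                                  # rendered RGB frame
    smplx_betas: np.ndarray                          # (10,) body shape coefficients
    smplx_pose: np.ndarray                           # (55, 3) axis-angle pose (body/hands/face)
    clothing_layers: Dict[str, str] = field(default_factory=dict)  # layer name -> mesh file path
    camera_extrinsics: Optional[np.ndarray] = None   # (4, 4) world-to-camera transform
    keypoints_3d: Optional[np.ndarray] = None        # (J, 3) 3D joint locations


def select_subset(records: List[LayeredHumanRecord], task: str) -> List[LayeredHumanRecord]:
    """Filter records for one of the two subsets named in the abstract."""
    if task == "hps":   # human pose and shape estimation: needs body parameters/joints
        return [r for r in records if r.keypoints_3d is not None]
    if task == "nerf":  # human neural rendering: needs clothing geometry and cameras
        return [r for r in records
                if r.clothing_layers and r.camera_extrinsics is not None]
    raise ValueError(f"unknown task: {task!r}")
```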
- Zhitao Yang
- Zhongang Cai
- Haiyi Mei
- Shuai Liu
- Zhaoxi Chen
- Weiye Xiao
- Yukun Wei
- Zhongfei Qing
- Chen Wei
- Bo Dai
- Wayne Wu
- Chen Qian
- Dahua Lin
- Ziwei Liu
- Lei Yang