
Cloth2Tex: A Customized Cloth Texture Generation Pipeline for 3D Virtual Try-On (2308.04288v1)

Published 8 Aug 2023 in cs.CV

Abstract: Fabricating and designing 3D garments has become extremely demanding with the increasing need for synthesizing realistic dressed persons for a variety of applications, e.g. 3D virtual try-on, digitalization of 2D clothes into 3D apparel, and cloth animation. It thus necessitates a simple and straightforward pipeline to obtain high-quality texture from simple input, such as 2D reference images. Traditional warping-based texture generation methods require a significant number of control points to be manually selected for each type of garment, which can be a time-consuming and tedious process. We propose a novel method, called Cloth2Tex, which eliminates the human burden in this process. Cloth2Tex is a self-supervised method that generates texture maps with reasonable layout and structural consistency. Another key feature of Cloth2Tex is that it can be used to support high-fidelity texture inpainting. This is done by combining Cloth2Tex with a prevailing latent diffusion model. We evaluate our approach both qualitatively and quantitatively and demonstrate that Cloth2Tex can generate high-quality texture maps and achieve the best visual effects in comparison to other methods. Project page: tomguluson92.github.io/projects/cloth2tex/

Citations (3)

Summary

  • The paper presents a two-stage process combining neural mesh rendering and latent diffusion to generate high-quality 3D clothing textures.
  • It leverages neural rendering to align 2D images with 3D meshes, overcoming traditional TPS methods for more complete texture mapping.
  • Experimental evaluations demonstrate enhanced texture fidelity with superior SSIM and PSNR metrics, validating its impact on digital fashion.

Cloth2Tex: A Customized Cloth Texture Generation Pipeline for 3D Virtual Try-On

The paper "Cloth2Tex: A Customized Cloth Texture Generation Pipeline for 3D Virtual Try-On" introduces an innovative approach to generating high-quality 3D textures for virtual clothing try-ons, addressing key limitations of existing methodologies in the field. This pipeline facilitates the transition from 2D clothing images to detailed 3D textured meshes, significantly broadening application possibilities in fashion e-commerce and digital garment design.

Methodology Overview

Cloth2Tex operates through a two-stage process:

  1. Shape and Coarse Texture Generation (Phase 1): This stage integrates neural mesh rendering to establish dense correspondences between 2D catalog images and UV textures. Departing from traditional Thin-Plate-Spline (TPS) methods, which often result in incomplete texture maps due to self-occlusions, the proposed approach leverages a neural renderer to optimize meshes and textures by aligning them with image colors, silhouettes, and key points.
  2. Fine Texture Refinement (Phase 2): To refine the coarse textures from Phase 1, the authors employ a latent diffusion model (LDM), using ControlNet to synthesize high-quality texture maps as training data. This phase includes training a texture inpainting network that fills missing areas in the texture maps using this large-scale synthetic data, yielding improved texture consistency and detail.
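The optimization structure of Phase 1 can be illustrated with a toy sketch. This is not the authors' implementation: the real method renders a 3D mesh through a differentiable neural mesh renderer and jointly fits color, silhouette, and keypoint terms, whereas here the "renderer" is reduced to a fixed texel-to-pixel correspondence matrix `R` (an assumption purely for illustration) and only the masked color term is optimized:

```python
import numpy as np

# Toy stand-in for Phase 1: adjust a UV texture so a differentiable
# "rendering" of it matches a reference image. R plays the role of the
# neural mesh renderer; the silhouette and keypoint losses of the real
# pipeline are omitted for brevity.
n = 16
R = np.eye(n)                    # trivial texel -> pixel correspondence
rng = np.random.default_rng(0)
target = rng.uniform(size=n)     # reference catalog-image colors
mask = np.ones(n)                # silhouette mask (all foreground here)

texture = np.zeros(n)            # coarse texture being optimized
lr = 0.5
for _ in range(200):
    residual = mask * (R @ texture - target)  # masked color-loss residual
    texture -= lr * (R.T @ residual)          # gradient-descent step
```

The same loop shape carries over to the full method: swap `R @ texture` for a differentiable render of the clothed mesh and add the silhouette and keypoint terms to the residual.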

Key Contributions

  • Diverse Template Mesh Models: The paper extends the variety of clothing types that can be effectively textured to over ten categories, surpassing existing efforts.
  • Neural Rendering for Texture Generation: By avoiding dependency on TPS warping, Cloth2Tex yields more accurate and complete initial texture maps, particularly for complex garment shapes.
  • Diffusion Model-Based Inpainting: Using synthetic data generated through a diffusion model, the approach overcomes the scarcity of training data, achieving superior inpainting results.

Experimental Evaluation

The authors conduct comprehensive evaluations, comparing Cloth2Tex against state-of-the-art methods such as Pix2Surf and TPS-based warping. Cloth2Tex consistently demonstrates higher texture fidelity and spatial consistency. Experiments reveal significant improvements in both SSIM and PSNR metrics, emphasizing the method's effectiveness over traditional techniques.
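For reference, the two reported metrics are straightforward to compute. The sketch below uses the standard PSNR formula and a simplified single-window (global-statistics) SSIM rather than the sliding-window variant typically used in published evaluations:

```python
import numpy as np

def psnr(x, y, peak=1.0):
    """Peak signal-to-noise ratio in dB between two images in [0, peak]."""
    mse = np.mean((x - y) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def ssim_global(x, y, peak=1.0):
    """SSIM computed over global image statistics (single window)."""
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (x.var() + y.var() + c2))

# Compare a clean texture against a noisy copy of it.
rng = np.random.default_rng(1)
clean = rng.uniform(size=(32, 32))
noisy = np.clip(clean + rng.normal(scale=0.05, size=clean.shape), 0.0, 1.0)
print(psnr(clean, noisy), ssim_global(clean, noisy))
```

Higher is better for both: PSNR grows as MSE shrinks, and SSIM reaches exactly 1 for identical images.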

Additionally, a user study corroborates the superior perceptual quality of Cloth2Tex outputs, reinforcing the method's practical applicability in creating realistic virtual try-on experiences.

Implications and Future Directions

The research presents substantial implications for the virtual try-on industry and 3D modeling of garments. Cloth2Tex provides a streamlined, automated solution to a traditionally manual and error-prone task, suggesting a shift towards more sophisticated, AI-driven processes in digital fashion technology.

Looking ahead, potential developments may include refining the algorithm to better handle garments with intricate patterns and exploring the integration of this technique into more interactive applications, such as AR-enabled virtual shopping experiences.

Cloth2Tex represents a significant step forward in the digital transformation of the fashion industry, offering a robust foundation for future innovations in 3D clothing visualization and modeling.
