The article presents CTGAN (Semantic-guided Conditional Texture Generator), a method for generating high-quality textures for 3D shapes. Traditional texture-creation workflows are time-consuming and subjective. CTGAN addresses these limitations by exploiting the disentangled nature of StyleGAN's latent space to finely manipulate the input latent codes, enabling explicit control over both the style and the structure of the generated textures. A coarse-to-fine encoder architecture further strengthens control over the structure of the resulting textures. CTGAN outperforms existing methods on multiple quality metrics and achieves state-of-the-art performance on texture generation.
Publication date: 8 Feb 2024
Project Page: https://doi.org/XXXXXXX.XXXXXXX
Paper: https://arxiv.org/pdf/2402.05728
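To give a feel for the style/structure separation described above, here is a minimal sketch of layer-wise latent mixing in a StyleGAN-like extended latent (W+) space, where early (coarse) layers tend to govern structure and later (fine) layers govern style. The layer count, latent width, and the coarse/fine split point are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

NUM_LAYERS = 14   # assumed number of synthesis layers in the generator
LATENT_DIM = 512  # assumed per-layer latent width
COARSE_SPLIT = 6  # assumed boundary: layers 0-5 -> structure, 6-13 -> style

def mix_latents(structure_code, style_code, split=COARSE_SPLIT):
    """Combine two W+ latent codes of shape (NUM_LAYERS, LATENT_DIM):
    take the coarse (structure-controlling) layers from one code and
    the fine (style-controlling) layers from the other."""
    mixed = np.empty_like(structure_code)
    mixed[:split] = structure_code[:split]  # coarse layers set structure
    mixed[split:] = style_code[split:]      # fine layers set style
    return mixed

# Usage: mix two random W+ codes standing in for encoder outputs.
rng = np.random.default_rng(0)
w_structure = rng.normal(size=(NUM_LAYERS, LATENT_DIM))
w_style = rng.normal(size=(NUM_LAYERS, LATENT_DIM))
w_mixed = mix_latents(w_structure, w_style)
```

Feeding `w_mixed` through a pretrained synthesis network would then yield a texture that inherits structure from one source and style from the other; the sketch only shows the latent-space operation itself.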