PolyDiff is the first diffusion-based approach capable of directly generating realistic and diverse 3D polygonal meshes. Unlike methods that rely on alternative 3D shape representations, PolyDiff operates natively on the polygonal mesh data structure, learning both the geometric properties of vertices and the topological characteristics of faces. PolyDiff treats meshes as quantized triangle soups that are progressively corrupted with categorical noise in the forward diffusion phase. In the reverse phase, a transformer-based denoising network is trained to invert this noising process and restore the original mesh structure. The model produces high-quality 3D polygonal meshes ready for integration into downstream 3D workflows, and shows significant improvement over current state-of-the-art methods.
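To make the "quantized triangle soup + categorical noise" idea concrete, here is a minimal sketch of one plausible forward-diffusion step. This is not the authors' implementation: the number of quantization bins, the number of diffusion steps, the linear noise schedule, and all function names below are illustrative assumptions. Vertex coordinates are discretized into integer bins, and at step `t` each token is independently resampled from a uniform categorical distribution with a probability that grows over time.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_BINS = 256  # quantization levels per coordinate (assumed)
T = 1000        # total diffusion steps (assumed)

def quantize(vertices, num_bins=NUM_BINS):
    """Map continuous vertex coordinates in [-1, 1] to integer bins."""
    v = np.clip(vertices, -1.0, 1.0)
    bins = ((v + 1.0) / 2.0 * num_bins).astype(np.int64)
    return np.minimum(bins, num_bins - 1)

def forward_categorical_noise(tokens, t, num_bins=NUM_BINS, total_steps=T):
    """Uniform-categorical corruption: keep each token with probability
    alpha_bar(t), otherwise resample it uniformly from all bins.
    A simple linear schedule is assumed here for illustration."""
    alpha_bar = 1.0 - (t + 1) / total_steps
    keep = rng.random(tokens.shape) < alpha_bar
    noise = rng.integers(0, num_bins, size=tokens.shape)
    return np.where(keep, tokens, noise)

# A "triangle soup": F faces x 3 vertices x 3 coordinates, no shared topology.
faces = rng.uniform(-1.0, 1.0, size=(4, 3, 3))
tokens = quantize(faces)

noisy_early = forward_categorical_noise(tokens, t=10)    # mostly intact
noisy_late = forward_categorical_noise(tokens, t=990)    # mostly random
```

The reverse process would then train a transformer to predict the clean token distribution from `noisy_late`-style inputs, but that training loop is beyond the scope of this sketch.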


Publication date: 18 Dec 2023
arXiv page: https://arxiv.org/abs/2312.11417
Paper (PDF): https://arxiv.org/pdf/2312.11417