The paper introduces DeepShaRM, a novel multi-view method for reconstructing the 3D geometry of textureless, non-Lambertian objects under unknown natural illumination. DeepShaRM achieves state-of-the-art accuracy by estimating reflectance and illumination jointly as a compound reflectance map, rather than attempting to disentangle them. The method alternates between a deep reflectance map estimation network and a deep shape-from-shading network, each refining the other's estimate. Extensive experiments on synthetic and real-world data demonstrate the accuracy of the recovered geometry.
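
The alternating structure can be sketched roughly as follows. This is a minimal illustration under assumptions, not the authors' implementation: the `ReflectanceMapNet` and `ShapeFromShadingNet` names, architectures, and tensor shapes are placeholders, and the compound reflectance map is simplified here to a per-pixel radiance prediction rather than the paper's actual representation.

```python
# Minimal sketch of an alternating estimation loop (assumed structure,
# not DeepShaRM's actual networks or geometry representation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ReflectanceMapNet(nn.Module):
    """Hypothetical stand-in: predicts compound reflectance (reflectance x
    illumination, left unseparated) from the image and current normals."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, image, normals):
        return self.net(torch.cat([image, normals], dim=1))


class ShapeFromShadingNet(nn.Module):
    """Hypothetical stand-in: refines per-pixel surface normals from the
    image and the current compound reflectance estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 3, 3, padding=1))

    def forward(self, image, reflectance):
        normals = self.net(torch.cat([image, reflectance], dim=1))
        return F.normalize(normals, dim=1)  # unit-length normals


def alternate(image, init_normals, n_iters=3):
    """Alternate between the two estimates; each step uses the other's output."""
    rm_net, sfs_net = ReflectanceMapNet(), ShapeFromShadingNet()
    normals = init_normals
    for _ in range(n_iters):
        reflectance = rm_net(image, normals)   # update compound reflectance
        normals = sfs_net(image, reflectance)  # update geometry (normals)
    return normals, reflectance


if __name__ == "__main__":
    img = torch.rand(1, 3, 64, 64)
    n0 = F.normalize(torch.rand(1, 3, 64, 64), dim=1)
    normals, reflectance = alternate(img, n0)
    print(normals.shape, reflectance.shape)
```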

Publication date: 26 Oct 2023
Project Page: https://vision.ist.i.kyoto-u.ac.jp/
Paper: https://arxiv.org/pdf/2310.17632