The paper proposes GauFRe, a method for dynamic scene reconstruction from monocular video using deformable 3D Gaussians. It extends the efficiency of Gaussian Splatting to dynamic scenes via a set of deformable Gaussians that reside in a canonical space and are warped by a time-dependent deformation field parameterized by a multi-layer perceptron (MLP). Building on the assumption that most natural scenes contain large static regions, the method renders in real time using a Gaussian Splatting rasterizer. It is optimized with a self-supervised rendering loss and achieves results comparable to state-of-the-art dynamic neural radiance field methods.
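To make the core idea concrete, here is a minimal, illustrative sketch of a time-conditioned deformation field: a tiny MLP that takes a canonical Gaussian mean plus a timestamp and outputs a positional offset. This is not GauFRe's actual implementation (the paper also deforms scales and rotations, and uses a learned, trained network); all sizes, names, and the untrained random weights here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy canonical Gaussian parameters (hypothetical sizes):
# N Gaussians with 3D means; the real method also deforms scale/rotation.
N = 4
canonical_means = rng.normal(size=(N, 3))

# A tiny 2-layer MLP mapping (mean, time) -> per-Gaussian 3D offset.
# Weights are random stand-ins; in the method they are learned.
W1 = rng.normal(scale=0.1, size=(4, 16))  # input: 3 coords + 1 time value
b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3))  # output: 3D positional offset
b2 = np.zeros(3)

def deform(means, t):
    """Warp canonical means with a time-conditioned deformation field."""
    x = np.concatenate([means, np.full((len(means), 1), t)], axis=1)
    h = np.tanh(x @ W1 + b1)
    return means + (h @ W2 + b2)  # canonical position + predicted offset

# The same canonical Gaussians yield different positions at each time.
means_t0 = deform(canonical_means, 0.0)
means_t1 = deform(canonical_means, 1.0)
```

In the full pipeline, the deformed Gaussians would then be passed to the Gaussian Splatting rasterizer, and the rendering loss against the input video frames supervises both the canonical Gaussians and the MLP.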


Publication date: 18 Dec 2023
Project Page: this url
Paper: https://arxiv.org/pdf/2312.11458