SC-GS: Sparse-Controlled Gaussian Splatting for Editable Dynamic Scenes

CVPR 2024
1The University of Hong Kong, 2VAST, 3Zhejiang University
*Equal Contribution, #Corresponding Author

Abstract

Novel view synthesis for dynamic scenes remains a challenging problem in computer vision and graphics. Recently, Gaussian splatting has emerged as a robust technique for representing static scenes, enabling high-quality, real-time novel view synthesis. Building on this technique, we propose a new representation that explicitly decomposes the motion and appearance of dynamic scenes into sparse control points and dense Gaussians, respectively. Our key idea is to use sparse control points, significantly fewer in number than the Gaussians, to learn compact 6-DoF transformation bases, which can be locally interpolated through learned interpolation weights to yield the motion field of the 3D Gaussians. We employ a deformation MLP to predict time-varying 6-DoF transformations for each control point, which reduces learning complexity, enhances learning ability, and facilitates temporally and spatially coherent motion patterns. We then jointly learn the 3D Gaussians, the canonical-space locations of the control points, and the deformation MLP to reconstruct the appearance, geometry, and dynamics of 3D scenes. During learning, the location and number of control points are adaptively adjusted to accommodate varying motion complexities in different regions, and an as-rigid-as-possible (ARAP) loss is developed to enforce the spatial continuity and local rigidity of the learned motions. Finally, thanks to the explicit sparse motion representation and its decomposition from appearance, our method enables user-controlled motion editing while retaining high-fidelity appearance. Extensive experiments demonstrate that our approach outperforms existing methods on novel view synthesis at a high rendering speed and enables novel appearance-preserving motion editing applications.
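
To make the motion model concrete, below is a minimal PyTorch-style sketch of the interpolation step described above: each control point carries a time-varying rigid transform predicted by the deformation MLP, and every Gaussian blends the transforms of its nearest control points through normalized radial-basis weights. The tensor shapes, the choice of K nearest neighbors, and the per-control-point radius ctrl_radius are illustrative assumptions, not the paper's exact formulation.

import torch

def interpolate_motion(gauss_xyz, ctrl_xyz, ctrl_rot, ctrl_trans, ctrl_radius, K=4):
    """Blend sparse control-point transforms into a dense per-Gaussian motion field.

    gauss_xyz   : (N, 3)    canonical Gaussian centers
    ctrl_xyz    : (M, 3)    canonical control-point locations
    ctrl_rot    : (M, 3, 3) rotations from the deformation MLP at time t
    ctrl_trans  : (M, 3)    translations from the deformation MLP at time t
    ctrl_radius : (M,)      per-control-point radii for the RBF weights (assumed)
    """
    # Find the K nearest control points of every Gaussian.
    d2 = torch.cdist(gauss_xyz, ctrl_xyz) ** 2                 # (N, M) squared distances
    d2_knn, idx = d2.topk(K, dim=1, largest=False)             # (N, K)

    # Radial-basis interpolation weights, normalized over the K neighbors.
    w = torch.exp(-d2_knn / (2 * ctrl_radius[idx] ** 2))       # (N, K)
    w = w / w.sum(dim=1, keepdim=True).clamp(min=1e-8)

    # Apply each neighbor's rigid transform to the Gaussian center, then blend.
    rel = gauss_xyz[:, None, :] - ctrl_xyz[idx]                # (N, K, 3)
    moved = torch.einsum('nkij,nkj->nki', ctrl_rot[idx], rel) + ctrl_xyz[idx] + ctrl_trans[idx]
    return (w[:, :, None] * moved).sum(dim=1)                  # (N, 3) deformed centers

As stated in the abstract, the control-point locations, the interpolation weights, and the deformation MLP are learned jointly with the 3D Gaussians.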

Video

Editing Guidance

We offer user-friendly editing interfaces and comprehensive guidance to empower users to create their ideal 3D assets.

Dynamic View Synthesis

Motion Editing

Interactive Motion Editing Tool

Users can interactively edit the scene motion by deforming the learned graph of control points.
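
As one illustration of how such an edit might propagate across the graph, below is a hedged sketch of an as-rigid-as-possible (ARAP) energy over the control points: user-dragged handles are held fixed, minimizing the energy over the remaining points keeps neighboring control points locally rigid, and the resulting transforms can then drive the dense Gaussians through the interpolation sketched earlier. The function signature and neighbor indexing are illustrative assumptions, not the tool's exact implementation.

import torch

def arap_energy(ctrl_xyz, new_xyz, new_rot, nbr_idx):
    """As-rigid-as-possible energy on the control-point graph (illustrative sketch).

    ctrl_xyz : (M, 3)     canonical control-point positions
    new_xyz  : (M, 3)     edited positions being optimized (handles held fixed)
    new_rot  : (M, 3, 3)  per-point rotations being optimized
    nbr_idx  : (M, K)     indices of each point's K nearest neighbors
    """
    # Offsets to each neighbor before and after the edit.
    rel_canon = ctrl_xyz[nbr_idx] - ctrl_xyz[:, None, :]           # (M, K, 3)
    rel_new = new_xyz[nbr_idx] - new_xyz[:, None, :]               # (M, K, 3)
    # Local rigidity: edited offsets should match rotated canonical offsets.
    rotated = torch.einsum('mij,mkj->mki', new_rot, rel_canon)     # (M, K, 3)
    return ((rel_new - rotated) ** 2).sum(dim=-1).mean()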

More Results on HyperNeRF Scenes

BibTeX

@article{huang2023sc,
  title={SC-GS: Sparse-Controlled Gaussian Splatting for Editable Dynamic Scenes},
  author={Huang, Yi-Hua and Sun, Yang-Tian and Yang, Ziyi and Lyu, Xiaoyang and Cao, Yan-Pei and Qi, Xiaojuan},
  journal={arXiv preprint arXiv:2312.14937},
  year={2023}
}

Acknowledgements

The website template is borrowed from Nerfies.