The article discusses a new framework for music-driven 3D dance generation. Existing methods focus on motion quality but offer little control over the generation process. The proposed framework produces high-quality dance movements under multi-modal control, covering genre, semantic, and spatial constraints. The dance generation network is decoupled from the control network, so adding control information does not degrade motion quality (see the sketch below). Dedicated control strategies are designed for each type of control signal and integrated into a unified framework. Experimental results show improved motion quality and controllability compared to existing methods.
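The decoupling idea can be illustrated with a short PyTorch sketch: a pretrained dance generator is frozen, and a separate control branch injects control features through a zero-initialized projection, so an untrained control signal initially contributes nothing and cannot hurt motion quality. All module names, dimensions, and the residual-injection pattern here are assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions, chosen only for the sketch.
MOTION_DIM = 147   # per-frame pose representation (assumed)
MUSIC_DIM = 438    # per-frame music feature (assumed)
HIDDEN = 256

class DanceGenerator(nn.Module):
    """Stand-in for a pretrained music-to-dance backbone (frozen at control time)."""
    def __init__(self):
        super().__init__()
        self.music_proj = nn.Linear(MUSIC_DIM, HIDDEN)
        self.backbone = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, MOTION_DIM)

    def forward(self, music, control_residual=None):
        h = self.music_proj(music)
        if control_residual is not None:
            h = h + control_residual  # control enters only as an additive residual
        out, _ = self.backbone(h)
        return self.head(out)

class ControlBranch(nn.Module):
    """Separate network mapping a control signal (e.g. a genre, semantic, or
    spatial embedding) to residual features. The output projection is
    zero-initialized so the branch is a no-op before training."""
    def __init__(self, control_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(control_dim, HIDDEN), nn.ReLU())
        self.zero_proj = nn.Linear(HIDDEN, HIDDEN)
        nn.init.zeros_(self.zero_proj.weight)
        nn.init.zeros_(self.zero_proj.bias)

    def forward(self, control):
        return self.zero_proj(self.encoder(control))

# Usage sketch: freeze the generator, train only the control branch.
generator = DanceGenerator()
for p in generator.parameters():
    p.requires_grad_(False)

control_branch = ControlBranch(control_dim=32)  # e.g. a genre embedding (assumed size)

music = torch.randn(2, 120, MUSIC_DIM)   # (batch, frames, music features)
control = torch.randn(2, 120, 32)        # per-frame control embedding
motion = generator(music, control_branch(control))
print(motion.shape)  # torch.Size([2, 120, 147])
```

Because the generator's weights are frozen and the control branch starts as an identity-preserving no-op, the quality of the pretrained dance model is preserved while each control modality can be trained as its own branch within the same framework.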
Publication date: 4 Jan 2024
Project Page: Not provided
Paper: https://arxiv.org/pdf/2401.01382