Keyframe Control of Music-driven 3D Dance Generation

Published in IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2023

Abstract: Choreography with artificial intelligence has recently attracted increasing attention from 3D animators. However, most existing deep learning methods rely solely on music for dance generation and offer insufficient control over the generated motions. To address this issue, we introduce the idea of keyframe interpolation for music-driven dance generation and present a novel transition generation technique for choreography. Specifically, this technique synthesizes visually diverse and plausible dance motions by using normalizing flows to learn the probability distribution of dance motions conditioned on a piece of music and a sparse set of key poses. Thus, the generated dance motions respect both the input musical beats and the key poses. To achieve robust transitions of varying lengths between the key poses, we introduce a time embedding at each timestep as an additional condition. Extensive experiments show that our model generates more realistic, diverse, and beat-matching dance motions than the compared state-of-the-art methods, both qualitatively and quantitatively. Our experimental results demonstrate the superiority of keyframe-based control for improving the diversity of the generated dance motions.
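The two ideas in the abstract — a conditional normalizing flow and a per-timestep time embedding — can be illustrated with a minimal sketch. All names here are hypothetical and the paper's actual architecture certainly differs (it uses trained neural networks, not a single linear map); this only shows the mechanics of an invertible coupling layer whose transform depends on a context built from music features, key poses, and a time embedding.

```python
import numpy as np

def time_embedding(t, T, dim=8):
    # Sinusoidal embedding of the normalized phase t/T of the transition,
    # giving the model an explicit notion of where it is between keyframes.
    freqs = 2.0 ** np.arange(dim // 2)
    phase = np.pi * t / T
    return np.concatenate([np.sin(freqs * phase), np.cos(freqs * phase)])

class AffineCoupling:
    """One conditional affine coupling layer: half of the pose vector is
    rescaled and shifted by values predicted from the other half plus the
    conditioning context (music features, key poses, time embedding).
    The transform is invertible, so the flow can map poses to latent
    noise and back while tracking the log-determinant of the Jacobian."""

    def __init__(self, dim, ctx_dim, rng):
        h = dim // 2
        # A single random linear map stands in for the scale/shift network.
        self.Ws = 0.1 * rng.standard_normal((h, h + ctx_dim))
        self.Wt = 0.1 * rng.standard_normal((h, h + ctx_dim))

    def forward(self, x, ctx):
        h = x.shape[0] // 2
        x1, x2 = x[:h], x[h:]
        inp = np.concatenate([x1, ctx])
        s = np.tanh(self.Ws @ inp)          # log-scale, bounded for stability
        t = self.Wt @ inp                   # shift
        y2 = x2 * np.exp(s) + t
        return np.concatenate([x1, y2]), s.sum()  # log|det J| = sum(s)

    def inverse(self, y, ctx):
        h = y.shape[0] // 2
        y1, y2 = y[:h], y[h:]
        inp = np.concatenate([y1, ctx])
        s = np.tanh(self.Ws @ inp)
        t = self.Wt @ inp
        x2 = (y2 - t) * np.exp(-s)
        return np.concatenate([y1, x2])

# Sketch of conditioning: concatenate music features with the time embedding
# for frame 3 of a 10-frame transition, then run one pose through the layer.
rng = np.random.default_rng(0)
music_feat = rng.standard_normal(4)          # placeholder audio features
ctx = np.concatenate([music_feat, time_embedding(3, 10)])
layer = AffineCoupling(6, ctx.size, rng)
pose = rng.standard_normal(6)
z, logdet = layer.forward(pose, ctx)
recovered = layer.inverse(z, ctx)
```

Because the first half of the vector passes through unchanged, the inverse is exact: `recovered` equals `pose` up to floating-point error, which is what lets a flow evaluate exact likelihoods during training and sample new motions at generation time.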

Download paper here

Recommended citation: Zhipeng Yang, Yu-Hui Wen, Shu-Yu Chen, Xiao Liu, Yuan Gao, Yong-Jin Liu*, Lin Gao, Hongbo Fu. Keyframe Control of Music-driven 3D Dance Generation. In: IEEE Transactions on Visualization and Computer Graphics, 2023.