Edge Dance
EDGE: Editable Dance Generation from Music is an AI tool that generates high-quality choreographies from music, conditioning on audio embeddings produced by the Jukebox model.
The tool encodes the input music into embeddings with a frozen Jukebox model; a conditional diffusion model then maps those embeddings to a series of 5-second dance clips.
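The two-stage flow above (frozen music encoder, then conditional diffusion over short motion clips) can be sketched as follows. This is a minimal illustration of the data flow only: the dimensions, function names, and the trivial "denoiser" are all assumptions for the sketch, not EDGE's actual architecture.

```python
import numpy as np

# Assumed shapes for illustration: the real system uses Jukebox features
# and a transformer-based diffusion model; only the data flow is mirrored here.
EMB_DIM = 4800   # per-frame music feature size (assumed)
POSE_DIM = 151   # per-frame pose representation size (assumed)
FRAMES = 150     # 5 seconds at an assumed 30 FPS

def encode_music(audio: np.ndarray) -> np.ndarray:
    """Stand-in for the frozen Jukebox encoder: one embedding per frame."""
    rng = np.random.default_rng(0)
    return rng.standard_normal((FRAMES, EMB_DIM))

def denoise_step(x_t: np.ndarray, cond: np.ndarray, t: int) -> np.ndarray:
    """Stand-in for one conditional denoising step.

    A real model predicts the clean motion from (x_t, cond, t); here the
    noise is simply shrunk toward zero so the sketch stays runnable.
    """
    return 0.9 * x_t

def generate_clip(audio: np.ndarray, steps: int = 50) -> np.ndarray:
    cond = encode_music(audio)                       # frozen music embedding
    x = np.random.default_rng(1).standard_normal((FRAMES, POSE_DIM))
    for t in reversed(range(steps)):                 # iterative denoising
        x = denoise_step(x, cond, t)
    return x                                         # one 5-second dance clip

clip = generate_clip(np.zeros(5 * 44100))            # 5 s of (silent) audio
print(clip.shape)                                    # (150, 151)
```

Each call produces one fixed-length clip; longer choreographies come from batching and stitching clips, as described next.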
At inference time, temporal constraints are applied across batches of multiple clips to enforce consistency before the clips are stitched into a full dance sequence of arbitrary length.
The tool supports arbitrary spatial and temporal constraints, making it suitable for various end-user applications, including dances subject to joint-wise constraints, motion in-betweening, and dance continuation.
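A common way to impose such constraints on a diffusion sampler is masked overwriting: after each denoising step, any user-specified entries (pinned joints, or the endpoint frames of an in-betweening problem) are written back over the model's sample. The sketch below shows that masking operation under assumed array shapes; it is an illustration of the general technique, not EDGE's exact editing code.

```python
import numpy as np

def apply_constraints(x: np.ndarray, known: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Overwrite constrained entries of a motion sample with known values.

    x, known: (frames, pose_dim) motion arrays; mask is 1 where the value
    is user-specified and 0 where the model is free to generate. Applying
    this after every denoising step keeps the final sample consistent with
    joint-wise, in-betweening, or continuation constraints.
    """
    return mask * known + (1.0 - mask) * x

# Usage: motion in-betweening pins the first and last frames.
frames, dim = 150, 151
x = np.ones((frames, dim))          # model's current sample (dummy values)
known = np.zeros((frames, dim))     # user-provided keyframes (dummy values)
mask = np.zeros((frames, dim))
mask[0], mask[-1] = 1.0, 1.0        # constrain only the endpoint frames
out = apply_constraints(x, known, mask)
print(out[0].sum(), out[1].sum())   # 0.0 151.0
```

Dance continuation is the same operation with the mask covering the leading frames, and joint-wise editing masks columns instead of rows.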
In addition, EDGE introduces a new Contact Consistency Loss that improves physical realism: it eliminates unintentional foot sliding while leaving intentional slides intact, ensuring that generated dances are physically plausible.
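One way to read such a loss is as a penalty on foot motion during frames the model itself predicts to be in ground contact: planted feet must not move, while a deliberate slide can simply carry a low contact prediction and go unpenalized. The sketch below implements that idea under assumed array shapes; the exact formulation in EDGE may differ.

```python
import numpy as np

def contact_consistency_loss(foot_pos: np.ndarray, contact: np.ndarray) -> float:
    """Penalize foot motion on frames marked as in contact with the ground.

    foot_pos: (frames, n_feet, 3) predicted foot joint positions.
    contact:  (frames-1, n_feet) predicted contact probabilities in [0, 1].
    Because the contact labels are predicted jointly with the motion,
    intentional slides can keep contact near 0 and escape the penalty.
    """
    vel = foot_pos[1:] - foot_pos[:-1]        # per-frame foot displacement
    sq_speed = np.sum(vel ** 2, axis=-1)      # (frames-1, n_feet)
    return float(np.mean(sq_speed * contact))

still = np.zeros((3, 1, 3))                    # planted foot, no motion
slide = np.cumsum(np.ones((3, 1, 3)), axis=0)  # foot translating every frame
on = np.ones((2, 1))                           # contact predicted "on"
off = np.zeros((2, 1))                         # contact predicted "off"
print(contact_consistency_loss(still, on))     # 0.0 — planted foot, no penalty
print(contact_consistency_loss(slide, on))     # 3.0 — sliding while in contact
print(contact_consistency_loss(slide, off))    # 0.0 — intentional slide, free
```

The zero-loss cases show why this is gentler than hard-zeroing foot velocity: only the combination of motion and predicted contact is penalized.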
EDGE was trained with physical realism in mind and has been shown to outperform previous work, with human raters expressing a strong preference for dances generated by EDGE.
Overall, EDGE: Editable Dance Generation from Music is a powerful AI tool suitable for generating high-quality choreographies from music, with potential applications in various industries, including entertainment and the arts.