Title
Learning To Dance Through Interactive Evolution
Abstract
A relatively rare application of artificial intelligence at the nexus of art and music is dance. The impulse shared by all humans to express themselves through dance represents a unique opportunity to artificially capture human creative expression. In particular, the spontaneity and relative ease of moving to music without any overall plan suggest a natural connection between temporal patterns and motor control. To explore this potential, this paper presents a model called Dance Evolution, which allows the user to train virtual humans to dance to MIDI songs or raw audio; that is, the dancers can dance to any song heard on the radio, including the latest pop music. The dancers are controlled by artificial neural networks (ANNs) that "hear" MIDI sequences or raw audio processed through a discrete Fourier transform-based technique. ANNs learn to dance in new ways through an interactive evolutionary process driven by the user. The main result is that when motion is expressed as a function of sound, the effect is a plausible approximation of the natural human tendency to move to music. © 2010 Springer-Verlag Berlin Heidelberg.
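The abstract mentions processing raw audio through a discrete Fourier transform-based technique so the ANN-controlled dancers can "hear" any song. A minimal sketch of that kind of preprocessing step, assuming NumPy and illustrative function and parameter names not taken from the paper, might look like:

```python
# Hypothetical sketch of a DFT-based audio feature extractor: it summarizes
# one audio frame as a few coarse spectral-band magnitudes, the sort of
# compact signal an ANN could take as input. Names are illustrative only.
import numpy as np

def dft_features(frame: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Reduce one audio frame to n_bands averaged spectral magnitudes."""
    # Window the frame, then take the magnitude spectrum via a real FFT.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Group frequency bins into coarse bands and average each band.
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([b.mean() for b in bands])
    # Normalize to [0, 1] so the features suit a neural-network input layer.
    return feats / (feats.max() + 1e-12)

# Example: a 440 Hz tone at an 8 kHz sample rate concentrates its energy
# in the lowest band of the 8-band summary.
sr = 8000
t = np.arange(1024) / sr
frame = np.sin(2 * np.pi * 440 * t)
print(dft_features(frame).argmax())  # lowest band dominates
```

Such a fixed-length feature vector per frame is one plausible way to turn a continuous audio stream into a sequence of ANN inputs, though the paper's actual pipeline may differ in detail.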
Publication Date
1-1-2010
Publication Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
6025 LNCS
Issue
PART 2
Number of Pages
331-340
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
DOI Link
https://doi.org/10.1007/978-3-642-12242-2_34
Copyright Status
Unknown
Scopus ID
77952338367 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/77952338367
STARS Citation
Dubbin, Greg A. and Stanley, Kenneth O., "Learning To Dance Through Interactive Evolution" (2010). Scopus Export 2010-2014. 1740.
https://stars.library.ucf.edu/scopus2010/1740