A parameterizable spatiotemporal representation of popular dance styles for humanoid dancing characters

dc.contributor.author Marc Leman en
dc.contributor.author João Lobato Oliveira en
dc.contributor.author Luiz Naveda en
dc.contributor.author Fabien Gouyon en
dc.contributor.author Luís Paulo Reis en
dc.contributor.author Paulo Sousa en
dc.date.accessioned 2017-11-16T13:42:47Z
dc.date.available 2017-11-16T13:42:47Z
dc.date.issued 2012 en
dc.description.abstract In this article, we formalize a model for the analysis and representation of popular dance styles characterized by repetitive gestures, specifying the parameters and validation procedures necessary to describe the spatiotemporal elements of the dance movement in relation to the temporal structure of the music (musical meter). Our representation model can precisely describe the structure of dance gestures according to the structure of the musical meter, at different temporal resolutions, and is flexible enough to convey the variability of the spatiotemporal relation between music structure and movement in space. It yields a compact and discrete mid-level representation of the dance that can be further applied to algorithms for the generation of movements in different humanoid dancing characters. en
dc.identifier.uri http://repositorio.inesctec.pt/handle/123456789/2469
dc.language eng en
dc.relation 5079 en
dc.relation 4847 en
dc.rights info:eu-repo/semantics/openAccess en
dc.title A parameterizable spatiotemporal representation of popular dance styles for humanoid dancing characters en
dc.type article en
dc.type Publication en