Online Audio Beat Tracking for a Dancing Robot in the Presence of Ego-Motion Noise in a Real Environment

dc.contributor.author João Lobato Oliveira en
dc.contributor.author Keisuke Nakamura en
dc.contributor.author Gökhan Ince en
dc.contributor.author Kazuhiro Nakadai en
dc.date.accessioned 2017-11-17T11:55:54Z
dc.date.available 2017-11-17T11:55:54Z
dc.date.issued 2012 en
dc.description.abstract This paper presents the design and implementation of a real-time, real-world beat tracking system that runs on a dancing robot. We propose to incorporate ego noise reduction as a pre-processing stage prior to tempo induction and beat tracking. The beat tracking algorithm follows an online strategy in which competing agents sequentially process a continuous musical input while maintaining parallel hypotheses about tempo and beat times. The system is applied to a humanoid robot that processes the audio from its embedded microphones on the fly while performing simple dancing motions. A detailed, multi-criteria evaluation across different music genres and varying stationary/non-stationary noise conditions is presented, showing improved performance and noise robustness: the system outperforms our conventional beat tracker (i.e., without ego noise suppression) by 15.2 points in tempo estimation and 15.0 points in beat-time prediction. en
dc.identifier.uri http://repositorio.inesctec.pt/handle/123456789/3304
dc.language eng en
dc.relation 5079 en
dc.rights info:eu-repo/semantics/openAccess en
dc.title Online Audio Beat Tracking for a Dancing Robot in the Presence of Ego-Motion Noise in a Real Environment en
dc.type conferenceObject en
dc.type Publication en
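
The abstract above outlines two components: ego noise suppression as a pre-processing stage, and an online beat tracker in which competing agents maintain parallel tempo and beat hypotheses over a continuous audio stream. The minimal Python sketch below illustrates only the general competing-agents idea under simplified assumptions; the agent update rules, the onset-strength noise gate standing in for ego noise suppression, and all parameter values are hypothetical and are not taken from the paper.

# Minimal sketch of an online, agent-based beat tracker fed by a noisy
# onset-strength stream. Illustrative only: the agent update rules, the
# simple noise gate standing in for ego noise suppression, and all
# parameters are hypothetical, not the authors' implementation.
from dataclasses import dataclass, field
from typing import List

HOP = 0.01          # seconds between onset-strength frames (assumed)
TOLERANCE = 0.07    # seconds an onset may deviate from a predicted beat
NOISE_FLOOR = 0.2   # onset-strength gate standing in for ego noise removal

@dataclass
class Agent:
    period: float        # tempo hypothesis as inter-beat interval (s)
    next_beat: float     # predicted time of the next beat (s)
    score: float = 0.0
    beats: List[float] = field(default_factory=list)

    def observe(self, t: float, strength: float) -> None:
        """Advance predictions to time t and score onsets that match them."""
        while self.next_beat < t - TOLERANCE:
            self.next_beat += self.period      # missed beat: move the prediction on
        if strength > NOISE_FLOOR and abs(t - self.next_beat) <= TOLERANCE:
            self.score += strength             # onset supports this hypothesis
            self.beats.append(self.next_beat)
            self.next_beat += self.period

def track_beats(onset_strengths: List[float]) -> List[float]:
    """Return the beat times predicted by the best-scoring agent."""
    # Seed parallel tempo hypotheses between 60 and 180 BPM.
    agents = [Agent(period=60.0 / bpm, next_beat=0.0)
              for bpm in range(60, 181, 10)]
    for i, s in enumerate(onset_strengths):
        t = i * HOP
        for a in agents:
            a.observe(t, s)
    best = max(agents, key=lambda a: a.score)
    return best.beats

if __name__ == "__main__":
    # Synthetic input: a strong onset every 0.5 s (120 BPM) plus low-level noise.
    frames = [1.0 if i % 50 == 0 else 0.05 for i in range(1000)]
    print(track_beats(frames)[:8])

In this sketch each agent keeps a single inter-beat-interval hypothesis and gains score whenever a gated onset falls close to its predicted beat; the best-scoring agent's predictions are reported. This mirrors, at a much coarser level, the parallel-hypothesis strategy described in the abstract.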