Abstract |
---|
We present a framework that efficiently synchronizes the dancing motion of a virtual character with music input. Two modules have been developed to achieve this. A music server analyzes the audio input on the fly and extracts musical information such as tempo, time signature, and beat times. A motion client selects and displays motion clips according to this information, time-warping each clip and synchronizing its start time with the music beat times. As a preprocessing step, we construct a database of motion clips from the captured dancing motion of a dancer. A PCA-based method is presented to easily identify transition segments in each motion clip. |
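The beat-alignment step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, its arguments, and the uniform time-warp assumption are all hypothetical.

```python
def warp_to_beats(clip_period, beat_times):
    """Return (speed_factor, start_time) that time-warps a motion clip
    whose internal beat period is `clip_period` seconds so its beats
    coincide with the music's detected beat times.

    Hedged sketch: assumes a uniform warp over the whole clip and that
    `beat_times` (seconds) comes from a music-analysis module like the
    paper's music server.
    """
    if len(beat_times) < 2:
        raise ValueError("need at least two beats to estimate tempo")
    # Average inter-beat interval of the music (seconds per beat).
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    music_period = sum(intervals) / len(intervals)
    # Uniform time-warp: play the clip faster or slower so that one
    # clip beat spans exactly one music beat.
    speed_factor = clip_period / music_period
    # Start the clip on a detected beat so the first beats line up.
    start_time = beat_times[0]
    return speed_factor, start_time
```

For example, a clip authored at one beat per 0.6 s played against music beating every 0.5 s would be sped up by a factor of 1.2 and started on the first detected beat.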
Year | DOI | Venue |
---|---|---|
2007 | 10.1109/ICIS.2007.137 | ACIS-ICIS |
Keywords | Field | DocType |
---|---|---|
time-warping,music information,motion control,virtual reality,music,pca-based method,music server,synchronizing,dancing character,virtual character,animation,databases,indexing,multiple signal classification,computer science,robots,time warping | Computer vision,Time signature,Motion control,Virtual reality,Computer graphics (images),Dynamic time warping,Computer science,Synchronizing,Search engine indexing,Beat (music),Animation,Artificial intelligence | Conference |
ISBN | Citations | PageRank |
---|---|---|
0-7695-2841-4 | 5 | 0.65 |
References | Authors |
---|---|
17 | 3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Gunwoo Kim | 1 | 92 | 7.13 |
Yan Wang | 2 | 5 | 0.65 |
Hyewon Seo | 3 | 349 | 31.09 |