Abstract |
---|
This work addresses the challenge of synchronizing multiple sources of visible and audible information from a variety of devices while capturing human motion in real time. Video and audio data will be used to augment and enrich a motion capture database that will be released to the research community. While other such augmented motion capture databases exist [Black and Sigal 2006], the goal of this work is to build on that previous work. Critical areas of improvement are the synchronization between cameras and the synchronization between devices. Adding an array of audio recording devices to the setup will greatly expand the research potential of the database, and the positioning of the cameras will be varied to give greater flexibility. The augmented database will facilitate the testing and validation of human pose estimation and motion tracking techniques, among other applications. This sketch briefly describes some of the interesting challenges faced in setting up the pipeline for capturing the synchronized data and the novel approaches proposed to solve them. |
Year | DOI | Venue |
---|---|---|
2009 | 10.1145/1666778.1666828 | SIGGRAPH Asia 2009 Sketches
Keywords | Field | DocType
---|---|---|
research community, audio recording device, audio data, human motion, motion capture database, augmented motion capture databases, capture system, synchronized real-time multi-sensor motion, research potential, audible information, augmented database, previous work, depth buffer, motion capture, real time, gpgpu, pose estimation, motion tracking | Computer vision, Motion capture, Synchronization, Computer graphics (images), Computer science, Synchronizing, Pose, Artificial intelligence, Sound recording and reproduction, Facial motion capture, Match moving, Sketch | Conference
Citations | PageRank | References
---|---|---|
0 | 0.34 | 1
Authors |
---|
4
Name | Order | Citations | PageRank |
---|---|---|---|
Jonathan Ruttle | 1 | 12 | 2.75 |
Michael Manzke | 2 | 44 | 9.19 |
Martin Prazak | 3 | 0 | 0.34 |
Rozenn Dahyot | 4 | 340 | 32.62 |