Abstract |
---|
We propose real-time, robust body-part tracking for an augmented reality interface that does not restrict the user's freedom of movement. The system's generality is improved over prior body-part tracking by its ability to recognize details such as whether the user is wearing long or short sleeves. For precise tracking, we capture images of the hands, head, and feet separately with a single camera, and when detecting each body part we select features appropriate to that part. Using a calibrated camera, we transfer the 2D body-part detections into an approximate 3D posture. In experiments evaluating the body-part tracking module, an application built on the proposed interface achieved advanced hand-tracking performance in real time (43.5 fps). |
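The abstract's step of transferring 2D detections into an approximate 3D posture with a calibrated camera can be illustrated with a standard pinhole back-projection. This is a minimal sketch under assumed intrinsics; the function names, parameter values, and fixed depth are illustrative, not taken from the paper.

```python
# Hedged sketch: lifting a 2D body-part detection to 3D with a calibrated
# pinhole camera. The intrinsics (fx, fy, cx, cy) and the assumed depth are
# placeholder values, not the paper's calibration.

def backproject(u, v, fx, fy, cx, cy):
    """Return the normalized 3D viewing ray through pixel (u, v)."""
    x = (u - cx) / fx
    y = (v - cy) / fy
    z = 1.0
    norm = (x * x + y * y + z * z) ** 0.5
    return (x / norm, y / norm, z / norm)

def lift_to_depth(u, v, fx, fy, cx, cy, depth):
    """Place the detection at an assumed depth (e.g. from a body model)."""
    rx, ry, rz = backproject(u, v, fx, fy, cx, cy)
    scale = depth / rz  # stretch the ray until its z-component equals depth
    return (rx * scale, ry * scale, rz * scale)

if __name__ == "__main__":
    # A detection at the principal point of a 640x480 frame, assumed 2 m away.
    print(lift_to_depth(320.0, 240.0, 500.0, 500.0, 320.0, 240.0, 2.0))
```

A single camera only yields a viewing ray per detection, so a depth assumption (such as a body model constraint) is what makes the recovered posture approximate rather than exact.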
Year | DOI | Venue |
---|---|---
2009 | 10.1145/1670252.1670295 | VRCAI |
Keywords | Field | DocType
---|---|---
precise body part tracking, proposed interface, real-time robust body part, single camera, body part tracking module, body part, advanced hand tracking performance, specific part, augmented reality interface, body part tracking, augmented reality, real time | Computer vision, Computer graphics (images), Computer science, Simulation, Tracking system, Augmented reality, Artificial intelligence, Generality | Conference
Citations | PageRank | References
---|---|---
4 | 0.46 | 8
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Jinki Jung | 1 | 33 | 7.50 |
Kyusung Cho | 2 | 45 | 7.29 |
Hyun S. Yang | 3 | 288 | 35.12 |