Abstract |
---|
An almost ubiquitous user interaction in HCI applications is the task of selecting one out of a given list of options. For example, in common desktop environments, the user moves the mouse pointer to the desired option and clicks it. The analog of this action in projector-camera HCI environments involves the user raising her finger to touch one of the virtual buttons projected on a display surface. In this paper, we discuss some of the challenges involved in tracking and recognizing this task in a projected immersive environment and present a hierarchical vision-based approach to detect intuitive gesture-based "mouse clicks" in a front-projected virtual interface. Given the difficulty of tracking user gestures directly in a projected environment, our approach first tracks shadows cast on the display by the user and exploits the multi-view geometry of the camera-projector pair to constrain a subsequent search for the user's hand position in the scene. The method only requires a simple setup step in which the projector's epipole in the camera's frame is estimated. We demonstrate how this approach is capable of detecting a contact event as a user interacts with a virtual pushbutton display. Results demonstrate that camera-based monitoring of user gesture is feasible even under difficult conditions in which the user is illuminated by changing and saturated colors. |
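The key geometric constraint in the abstract is that, once the projector's epipole in the camera frame is known, a detected shadow point and the epipole define a line in the camera image along which the occluding hand must lie. A minimal sketch of that idea in homogeneous coordinates is below; the function names and the point-to-line tolerance are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def search_line(epipole, shadow_pt):
    """Line (homogeneous coords) through the projector's epipole and a
    detected shadow point in the camera image. The hand that cast the
    shadow must lie somewhere on this line, so it constrains the search.
    Names and interface are hypothetical, not from the paper."""
    e = np.array([epipole[0], epipole[1], 1.0])
    s = np.array([shadow_pt[0], shadow_pt[1], 1.0])
    # The cross product of two homogeneous points is the line joining them.
    return np.cross(e, s)

def on_line(line, pt, tol=1.0):
    """True if pt lies within tol pixels of the line (assumed tolerance)."""
    p = np.array([pt[0], pt[1], 1.0])
    # Perpendicular point-to-line distance: |l . p| / sqrt(a^2 + b^2)
    return abs(line @ p) / np.linalg.norm(line[:2]) < tol
```

For example, with the epipole at the image origin and a shadow point at (2, 2), candidate hand positions are restricted to the diagonal through both, which prunes the rest of the image from the search.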
Year | DOI | Venue |
---|---|---|
2004 | 10.1007/0-387-27890-7_12 | CVPR Workshops |
Field | DocType | ISBN
---|---|---
Computer vision,Virtual reality,Epipolar geometry,Visualization,Computer science,Gesture,Projector,Human–computer interaction,Immersion (virtual reality),Artificial intelligence,User interface,Application software | Conference | 0-7695-2158-4
Citations | PageRank | References
---|---|---
4 | 0.65 | 16
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Amit Kale | 1 | 708 | 48.47 |
Kenneth Kwan | 2 | 4 | 0.65 |
Christopher Jaynes | 3 | 245 | 20.92 |