Title
Interpreting 2D gesture annotations in 3D augmented reality
Abstract
A 2D gesture annotation provides a simple way to annotate the physical world in augmented reality for a range of applications such as remote collaboration. When rendered from novel viewpoints, these annotations have previously only worked with statically positioned cameras or planar scenes. However, if the camera moves and is observing an arbitrary environment, 2D gesture annotations can easily lose their meaning when shown from novel viewpoints due to perspective effects. In this paper, we present a new approach to this problem based on gesture-enhanced annotation interpretation. By first classifying which type of gesture the user drew, we show that it is possible to render 2D annotations in 3D in a way that conforms more closely to the user's original intention than traditional methods do. We first determined a generic vocabulary of important 2D gestures for an augmented-reality-enhanced remote collaboration scenario by running an Amazon Mechanical Turk study with 88 participants. Next, we designed a novel real-time method to automatically handle the two most common 2D gesture annotations, arrows and circles, and give a detailed analysis of the ambiguities that must be handled in each case. Arrow gestures are interpreted by identifying their anchor points and using scene surface normals for better perspective rendering. For circle gestures, we designed a novel energy function to help infer the object of interest using both 2D image cues and 3D geometric cues. Results indicate that our method outperforms previous approaches by better conveying the meaning of the original drawing from different viewpoints.
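The record does not give the actual form of the circle-gesture energy function, only that it combines 2D image cues and 3D geometric cues. The following is a minimal sketch of that idea under stated assumptions: a hypothetical 2D cue (overlap between the drawn circle region and a candidate object's image mask) and a hypothetical 3D cue (depth compactness of the candidate's points); the function names, weights, and cue choices are illustrative and are not taken from the paper.

```python
import numpy as np

def circle_energy(circle_mask, object_mask, object_depths,
                  w_2d=1.0, w_3d=1.0):
    """Score a candidate object for a circled region (lower = better fit).

    circle_mask, object_mask: boolean HxW arrays for the drawn circle
    region and the candidate object's 2D segmentation mask.
    object_depths: 1D array of depth samples on the candidate object.
    All names, weights, and cues are illustrative, not from the paper.
    """
    # 2D image cue: penalize poor overlap between the circled region
    # and the candidate object's mask (1 - intersection over union).
    inter = np.logical_and(circle_mask, object_mask).sum()
    union = np.logical_or(circle_mask, object_mask).sum()
    e_2d = 1.0 - (inter / union if union > 0 else 0.0)

    # 3D geometric cue: penalize candidates whose depth samples are
    # widely spread, i.e. unlikely to form a single coherent object.
    e_3d = float(np.std(object_depths)) if object_depths.size else 1.0

    return w_2d * e_2d + w_3d * e_3d

def select_object(circle_mask, candidates):
    """Pick the (mask, depths) candidate with the lowest energy."""
    return min(candidates,
               key=lambda c: circle_energy(circle_mask, c[0], c[1]))
```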
Year
2016
DOI
10.1109/3DUI.2016.7460046
Venue
2016 IEEE Symposium on 3D User Interfaces (3DUI)
Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems — Artificial, augmented, and virtual realities; H.5.2 [Information Interfaces and Presentation]: User Interfaces — Interaction styles
Field
Computer vision, Annotation, Gesture, Viewpoints, Computer science, Gesture recognition, Augmented reality, Human–computer interaction, Artificial intelligence, Rendering (computer graphics), User interface, Vocabulary
DocType
Conference
Citations
3
PageRank
0.39
References
27
Authors
4
Name                  Order  Citations  PageRank
Benjamin Nuernberger  1      102        5.91
Kuo-Chin Lien         2      95         6.46
Tobias Höllerer       3      2666       244.50
Matthew Turk          4      3724       499.42