Title
Ginput: a tool for fast hi-fi prototyping of gestural interactions in virtual reality
Abstract
Gestural interfaces in virtual reality (VR) expand the design space for user interaction, allowing spatial metaphors with the environment and more natural, immersive experiences. Machine learning approaches typically recognize gestures with models that rely on a large number of training samples, which is an obstacle to rapidly prototyping gestural interactions. In this paper, we propose a solution for hi-fi prototyping of gestures within a virtual reality environment through a high-level Domain-Specific Language (DSL) designed as a subset of natural language. The proposed DSL allows non-programmer users to intuitively describe a broad domain of poses and connect them into compound gestures. Our DSL was designed to be general enough for multiple input classes, such as body tracking, hand tracking, head movement, motion controllers, and buttons. We tested our solution for wand input with VR designers and developers. Results showed that the tool gives non-programmers the ability to prototype gestures with ease and refine their recognition within a few minutes.
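To make the idea concrete, below is a minimal, hypothetical sketch (in Python) of how natural-language-like pose descriptions could be parsed and chained into compound gestures over streamed tracking frames. The rule syntax, function names, joint names, and threshold are illustrative assumptions only; they are not the actual Ginput DSL or its implementation.

    # Hypothetical sketch: syntax and names are assumptions, not the Ginput DSL.
    import re

    # Toy pose rule of the form "<joint> is <relation> <reference>",
    # e.g. "right hand is above head" or "right hand is near head".
    RULE = re.compile(r"(\w+ ?\w*) is (above|below|near) (\w+ ?\w*)")

    def pose_matches(description, frame, near_threshold=0.15):
        """Check one tracking frame (joint name -> (x, y, z)) against a pose description."""
        joint_a, relation, joint_b = RULE.fullmatch(description).groups()
        ax, ay, az = frame[joint_a]
        bx, by, bz = frame[joint_b]
        if relation == "above":
            return ay > by
        if relation == "below":
            return ay < by
        # "near": Euclidean distance under a threshold (meters, assumed unit)
        dist = ((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2) ** 0.5
        return dist < near_threshold

    def gesture_matches(pose_sequence, frames):
        """A compound gesture is an ordered sequence of poses, matched in order across frames."""
        idx = 0
        for frame in frames:
            if idx < len(pose_sequence) and pose_matches(pose_sequence[idx], frame):
                idx += 1
        return idx == len(pose_sequence)

    # Example with fabricated tracking frames: raise the hand, then bring it near the head.
    frames = [
        {"right hand": (0.3, 1.0, 0.2), "head": (0.0, 1.6, 0.0)},
        {"right hand": (0.3, 1.8, 0.2), "head": (0.0, 1.6, 0.0)},
        {"right hand": (0.05, 1.7, 0.05), "head": (0.0, 1.6, 0.0)},
    ]
    print(gesture_matches(["right hand is above head", "right hand is near head"], frames))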
Year
2020
DOI
10.1109/ISMAR-Adjunct51615.2020.00030
Venue
2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
Keywords
Human-centered computing, Visualization, Visualization techniques, Treemaps, Visualization design and evaluation methods
DocType
Conference
ISBN
978-1-7281-7676-5
Citations
0
PageRank
0.34
References
4
Authors
9