Title
Use of Sound to Provide Occluded Visual Information in Touch Gestural Interface
Abstract
Direct touch gestures are becoming popular as an input modality for mobile and tabletop interaction. However, finger touch interfaces are considered less accurate than pen-based interfaces. One of the main reasons is that the visual feedback of a finger touch is occluded by the fingertip itself, which makes it difficult to perceive and correct errors. We propose to use another modality to convey the information hidden in the occluded area: spatial information on the visual channel is transformed into temporal and frequency information on a second channel. We use sound to illustrate the proposed trans-modal mapping. Results show that, for drawing tasks where the occluded visual information matters, performance with the additional sound modality is better than with visual feedback alone.
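The abstract describes transforming spatial information from the occluded region into temporal and frequency information on an audio channel. As a rough, hypothetical sketch of one such mapping (not the authors' implementation; the constants, function names, and offset-to-pitch rule are illustrative assumptions), the fingertip's offset from the intended stroke could drive the pitch of a short feedback tone, e.g. in Python:

import math

# Hedged sketch: map the occluded spatial offset of the fingertip from a
# target stroke to an audible pitch (spatial -> frequency trans-modality).
# BASE_HZ, HZ_PER_PX and the sample rate are illustrative assumptions,
# not values taken from the paper.

SAMPLE_RATE = 44_100          # audio samples per second
BASE_HZ = 440.0               # pitch heard when the finger is exactly on target
HZ_PER_PX = 8.0               # pitch change per pixel of occluded offset

def offset_to_frequency(offset_px: float) -> float:
    """Convert the (occluded) distance between the touch point and the
    intended stroke into a tone frequency: positive offsets raise the
    pitch, negative offsets lower it."""
    return max(80.0, BASE_HZ + HZ_PER_PX * offset_px)

def tone_samples(offset_px: float, duration_s: float = 0.05):
    """Generate mono float samples for a short feedback tone whose pitch
    encodes the current offset; short bursts let the pitch track the
    moving finger over time."""
    freq = offset_to_frequency(offset_px)
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2.0 * math.pi * freq * i / SAMPLE_RATE) for i in range(n)]

if __name__ == "__main__":
    # Example: the finger drifts 0, +5 and -5 pixels from the target line.
    for off in (0.0, 5.0, -5.0):
        print(f"offset {off:+.1f} px -> {offset_to_frequency(off):.1f} Hz")

In this sketch the direction and magnitude of a drawing error are audible even while the fingertip hides the stroke; a real system would also need low-latency audio output, which is omitted here.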
Year
2015
DOI
10.1145/2702613.2732817
Venue
CHI Extended Abstracts
Keywords
miscellaneous, finger gesture, visual occlusion, touch gestural interface, sound feedback
Field
Spatial analysis, Direct touch, Computer vision, Gesture, Computer science, Communication channel, Human–computer interaction, Artificial intelligence, Multimedia, Visual occlusion
DocType
Conference
Citations
0
PageRank
0.34
References
10
Authors
5
Name            Order  Citations  PageRank
BoYu Gao        1      2          3.09
HyungSeok Kim   2      116        22.09
Hasup Lee       3      6          1.77
Jooyoung Lee    4      573        46.13
Jee-In Kim      5      72         19.78