Title
Speech and multimodal interaction in mobile GIS search: a case of study
Abstract
In this short paper we present an Android mobile application that makes use of a multimodal interface. The application relies on a proprietary architecture based on the W3C MM-Framework recommendation. Users can talk and make natural gestures on the mobile device's touch screen in order to formulate a complex query to a geo-referenced web service. The interaction produces a query originating from a semantically incomplete spoken sentence that is completed by a deictic gesture (e.g., "please, find all bus stops in this area", uttered while tapping or drawing a circle on the screen where a map is shown).
Year: 2012
DOI: 10.1007/978-3-642-29247-7_3
Venue: W2GIS
Keywords: natural gesture, complex interrogation, W3C recommendation, mobile GIS search, multimodal interface, mobile device touch screen, geo-referenced web service, multimodal interaction, proprietary architecture, Android mobile application, deictic gesture, bus stop
Field: Multimodal interaction, Mobile search, Android (operating system), Gesture, Computer science, Mobile device, Human–computer interaction, Deixis, Web service, Multimedia, Sentence, Database
DocType: Conference
Citations: 1
PageRank: 0.40
References: 4
Authors: 4

Name                 Order  Citations  PageRank
Francesco Cutugno    1      76         18.01
Vincenza Anna Leano  2      9          2.26
Gianluca Mignini     3      9          1.25
Roberto Rinaldi      4      110        10.45