Title: Multilevel auditory displays for mobile eyes-free location-based interaction
Abstract: This paper explores the use of multilevel auditory displays to enable eyes-free mobile interaction with location-based information in a conceptual art exhibition space. Multilevel auditory displays enable user interaction with concentrated areas of information. However, it is necessary to consider how to present the auditory streams without overloading the user. We present an initial study in which a top-level exocentric sonification layer was used to advertise information present in a gallery-like space. Then, in a secondary interactive layer, three different conditions were evaluated that varied in the presentation (sequential versus simultaneous) and spatialisation (non-spatialised versus egocentric) of multiple auditory sources. Results show that (1) participants spent significantly more time interacting with spatialised displays, (2) there was no evidence that a switch from an exocentric to an egocentric display increased workload or lowered satisfaction, and (3) there was no evidence that simultaneous presentation of spatialised Earcons in the secondary display increased workload.
Year: 2014
DOI: 10.1145/2559206.2581254
Venue: CHI Extended Abstracts
Keywords: gallery-like space, multilevel auditory display, location-based information, multiple auditory source, conceptual art exhibition space, eyes-free mobile interaction, egocentric display, information present, mobile eyes-free location-based interaction, auditory stream, secondary display
Field: Workload, Computer science, Endocentric and exocentric, Human–computer interaction, Sonification, Auditory display, Mobile interaction, Multimedia
DocType: Conference
Citations: 0
PageRank: 0.34
References: 10
Authors: 5

Order  Name                     Citations  PageRank
1      Yolanda Vazquez-Alvarez  115        11.67
2      Matthew P. Aylett        936        237.25
3      Stephen Brewster         4913       474.60
4      Rocio von Jungenfeld     0          1.01
5      Antti Virolainen         28         3.39