Title
Speech-based interaction in multitask conditions: impact of prompt modality.
Abstract
Speech-based interaction is often considered appropriate for hands-busy, eyes-busy multitask situations. The objective of this study was to explore prompt-guided speech-based interaction and the impact of prompt modality on overall performance in such situations. A dual-task paradigm was employed, with tracking as the primary task and speech-based data entry as the secondary task. There were three tracking conditions: no tracking, basic tracking, and difficult tracking. Two prompt modalities were used for the speech interaction: a dialogue with spoken prompts and a dialogue with visual prompts. Data entry duration was longer with spoken prompts than with visual prompts when there was no tracking or basic tracking; however, when tracking was difficult, data entry duration was similar for the two prompt modalities. Tracking performance was also affected by prompt modality, with poorer performance when the prompts were visual. The findings are discussed in terms of multiple resource theory and their implications for speech-based interaction in multitask situations. Actual or potential applications of this research include the design of speech-based dialogues for multitask situations such as driving and other hands-busy, eyes-busy activities.
Year
2005
DOI
10.1518/001872005774860041
Venue
HUMAN FACTORS
Keywords
human factors engineering, motor skills, speech, multitasking
Field
Interaction design, Simulation, Workload, Computer science, Subject-matter expert, Navigation system, Cognitive psychology, Human–computer interaction, Resource allocation, User interface, Human multitasking, Resource dependence theory
DocType
Journal
Volume
47
Issue
3
ISSN
0018-7208
Citations
6
PageRank
0.76
References
10
Authors
1
Name
Avi Parush
Order
1
Citations
198
PageRank
22.17