Title
A Human-Centered Taxonomy Of Interaction Modalities And Devices
Abstract
In the field of human-computer interaction, taxonomies are used to classify and describe interaction (i.e., input and output) modalities, methods, technologies and devices. However, most of these taxonomies and classification schemes so far consider only a subset of modalities and related methods, often reducing them to vision, audition and touch. Additionally, they are usually technology- or task-centered rather than human-centered and thus vulnerable to rapid outdating as technology advances and novel sensor and actuator technologies emerge. To tackle both problems, we propose a novel taxonomy designed around the human and the human capabilities to sense output from and provide input to computer systems. We argue that although knowledge about the human sensory system may be subject to change as well, this change is considerably slower than technological advancement. Further, we restrict the taxonomy to what humans can actively and consciously sense or produce, so that novel findings related to human perception do not immediately compromise its validity. This article motivates the need for a novel taxonomy in light of how computers and humans are able to perceive each other. It discusses existing taxonomies and introduces a new one that is intended to be (i) centered around the human and (ii) as holistic and timeless as possible. Finally, the new taxonomy was evaluated with six human-computer interaction experts with regard to its practical use for researchers and different application scenarios.
Year
2019
DOI
10.1093/iwc/iwz003
Venue
INTERACTING WITH COMPUTERS
Keywords
interaction modalities, interaction devices, taxonomy handling
Field
Modalities, Computer science, Human–computer interaction, Multimedia
DocType
Journal
Volume
31
Issue
1
ISSN
0953-5438
Citations
1
PageRank
0.34
References
5
Authors
2
Name, Order, Citations, PageRank
Mirjam Augstein, 1, 21, 12.40
Thomas Neumayr, 2, 15, 8.51