Title: Modeling Cognitive Processes from Multimodal Signals
Abstract
Multimodal signals allow us to gain insights into a person's internal cognitive processes: speech and gesture analysis yields cues about hesitation, knowledgeability, or alertness; eye tracking yields information about a person's focus of attention, task, or cognitive state; and EEG yields information about a person's cognitive load or information appraisal. Capturing cognitive processes is an important research tool for understanding human behavior, as well as a crucial part of the user model of an adaptive interactive system such as a robot or a tutoring system. Because cognitive processes are often multifaceted, a comprehensive model requires the combination of multiple complementary signals. In this workshop at the ACM International Conference on Multimodal Interaction (ICMI) in Boulder, Colorado, USA, we discussed the state of the art in monitoring and modeling cognitive processes from multimodal signals.
Year: 2018
DOI: 10.1145/3242969.3265861
Venue: ICMI
Field: Multimodality, Computer science, Eye tracking, Human–computer interaction, User modeling, Robot, Cognition, Cognitive load, Electroencephalography, Alertness
DocType: Conference
ISBN: 978-1-4503-5692-3
Citations: 0
PageRank: 0.34
References: 0
Authors: 6

Name               Order  Citations  PageRank
Felix Putze        1      205        29.73
Jutta Hild         2      2          26.19
Akane Sano         3      8          48.96
Enkelejda Kasneci  4      202        33.86
Erin T. Solovey    5      730        48.09
T. Schultz         6      24232      52.72