Title: A Multimodal-Sensor-Enabled Room for Unobtrusive Group Meeting Analysis
Abstract: Group meetings can suffer from serious problems that undermine performance, including bias, "groupthink", fear of speaking, and unfocused discussion. To better understand these issues, propose interventions, and thus improve team performance, we need to study human dynamics in group meetings. However, this process currently depends heavily on manual coding and video cameras. Manual coding is tedious, inaccurate, and subjective, while visible video cameras can alter the natural behavior of meeting participants. Here, we present a smart meeting room that combines microphones and unobtrusive ceiling-mounted Time-of-Flight (ToF) sensors to understand group dynamics in team meetings. We automatically process the multimodal sensor outputs with signal, image, and natural language processing algorithms to estimate participant head pose, visual focus of attention (VFOA), non-verbal speech patterns, and discussion content. We derive metrics from these automatic estimates and correlate them with user-reported rankings of emergent group leaders and major contributors to produce accurate predictors. We validate our algorithms and report results on a new dataset of 36 individuals across 10 groups performing a lunar survival task in the multimodal-sensor-enabled smart room.
Year: 2018
DOI: 10.1145/3242969.3243022
Venue: ICMI
Keywords: Multimodal sensing, smart rooms, time-of-flight sensing, head pose estimation, natural language processing, meeting summarization, group meeting analysis
Field: Computer science, Human dynamics, Coding (social sciences), Human–computer interaction, Meeting analysis, Smart rooms
DocType: Conference
ISBN: 978-1-4503-5692-3
Citations: 1
PageRank: 0.35
References: 26
Authors: 10
Name                    Order  Citations  PageRank
Indrani Bhattacharya    1      4          3.77
Michael Foley           2      1          1.70
Ni Zhang                3      1          1.03
Tongtao Zhang           4      35         6.78
Christine Ku            5      1          0.69
Cameron Mine            6      1          0.69
Heng Ji                 7      1544       127.27
Christoph Riedl         8      321        19.82
Brooke Foucault Welles  9      28         3.40
Richard J. Radke        10     1289       78.89