Title
Reconstructing multiparty conversation field by augmenting human head motions via dynamic displays
Abstract
A novel system is presented for reconstructing multiparty face-to-face conversation scenes in the real world using dynamic displays that augment human head motion. The system displays and plays back recorded conversations as if the remote participants were talking in front of the viewer. It consists of multiple projectors and transparent screens mounted on actuators; the screens, which display life-size faces, are spatially arranged to recreate the original scene. Each screen's pose is dynamically synchronized with the recorded head motion of the corresponding participant, since head motion typically indicates shifts in visual attention. Our hypothesis is that physical screen motion combined with image motion can boost the viewer's understanding of others' visual attention. Experiments suggest that viewers can more clearly discern the attention of meeting participants and more accurately identify addressees.
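The extended abstract does not give implementation details; purely as an illustrative sketch, the Python snippet below shows one way playback software might map a recorded head-pose track onto pan/tilt targets for a motorized screen. All names and parameters here (HeadPose, actuator_targets, the 60/25 degree limits, the commented-out send_to_actuator call) are hypothetical assumptions, not taken from the paper.

```python
import bisect
from dataclasses import dataclass

@dataclass
class HeadPose:
    t: float      # seconds from start of the recording
    yaw: float    # degrees, positive = participant looking to their left
    pitch: float  # degrees, positive = participant looking up

def interpolate_pose(track, t):
    """Linearly interpolate a recorded head-pose track at playback time t."""
    if t <= track[0].t:
        return track[0]
    if t >= track[-1].t:
        return track[-1]
    i = bisect.bisect_left([p.t for p in track], t)
    a, b = track[i - 1], track[i]
    w = (t - a.t) / (b.t - a.t)
    return HeadPose(t, a.yaw + w * (b.yaw - a.yaw), a.pitch + w * (b.pitch - a.pitch))

def actuator_targets(pose, yaw_limit=60.0, pitch_limit=25.0):
    """Clamp the emulated head pose to the pan/tilt range of the screen actuator (assumed limits)."""
    clamp = lambda v, lim: max(-lim, min(lim, v))
    return clamp(pose.yaw, yaw_limit), clamp(pose.pitch, pitch_limit)

# Example: drive one screen at 30 Hz during playback of a short recorded track.
track = [HeadPose(0.0, 0.0, 0.0), HeadPose(0.5, 20.0, -5.0), HeadPose(1.0, 35.0, 0.0)]
for frame in range(31):
    t = frame / 30.0
    pan, tilt = actuator_targets(interpolate_pose(track, t))
    # send_to_actuator(screen_id=0, pan=pan, tilt=tilt)  # hypothetical actuator API
```

In such a scheme the screen's physical pan/tilt would follow the same trajectory as the displayed face image, which is the coupling of physical and image motion that the abstract hypothesizes aids attention perception.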
Year
2012
DOI
10.1145/2212776.2223783
Venue
CHI Extended Abstracts
DocType
Conference
Keywords
actual head motion, transparent screen, human head motion, image motion, novel system, head motion, multiparty conversation field, dynamic display, actual scene, physical screen motion, visual attention, multimodal interaction
Field
Computer vision, Projection mapping, Conversation, Computer science, Image motion, Human–computer interaction, Visual attention, Artificial intelligence, Multimedia, Face to face conversation, Human head
Citations
2
PageRank
0.41
References
3
Authors
5
Name               Order  Citations  PageRank
Kazuhiro Otsuka    1      619        54.15
Shiro Kumano       2      149        16.82
Dan Mikami         3      118        17.60
Masafumi Matsuda   4      41         5.00
Junji Yamato       5      1120       165.72