Title
Voice activity detection from gaze in video mediated communication
Abstract
This paper discusses estimation of the active speaker in multi-party video-mediated communication from the gaze data of one of the participants. In the explored setting, we predict the voice activity of participants in one room based on gaze recordings of a single participant in another room. The two rooms were connected by high-definition, low-delay audio and video links, and the participants engaged in activities ranging from casual discussion to simple problem-solving games. We treat the task as a classification problem and evaluate several types of features and parameter settings within a Support Vector Machine classification framework. The results show that, using the proposed approach, the vocal activity of a speaker can be correctly predicted for 89% of the time for which gaze data are available.
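The abstract frames active-speaker prediction as binary classification of gaze-derived features with an SVM. The sketch below illustrates one way such a pipeline could look; the windowed gaze statistics, window and step sizes, and SVM parameters are illustrative assumptions, not the features or settings evaluated in the paper.

# Minimal sketch (not the authors' implementation): predicting a remote
# participant's voice activity from one observer's gaze, cast as binary
# classification with an SVM. All feature choices and parameters below
# are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def gaze_window_features(gaze_xy, window=90, step=30):
    """Slice a (T, 2) gaze trace into overlapping windows and compute
    simple per-window statistics (mean position, spread, dispersion)."""
    feats = []
    for start in range(0, len(gaze_xy) - window + 1, step):
        w = gaze_xy[start:start + window]
        feats.append(np.concatenate([
            w.mean(axis=0),                       # average gaze position
            w.std(axis=0),                        # spread along x and y
            [np.ptp(w[:, 0]), np.ptp(w[:, 1])],   # dispersion (range)
        ]))
    return np.asarray(feats)

# Hypothetical data: the observer's gaze samples and, per window, whether
# a given remote participant was speaking (1) or silent (0).
rng = np.random.default_rng(0)
gaze = rng.random((3000, 2))
X = gaze_window_features(gaze)
y = rng.integers(0, 2, size=len(X))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())  # chance level on random data

With real recordings, the labels would come from the audio channel (voice activity of each remote participant) aligned to the observer's gaze timestamps, and the classifier would be evaluated per participant rather than on synthetic data.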
Year
2012
DOI
10.1145/2168556.2168628
Venue
ETRA
Keywords
voice activity, video mediated communication, low delay audio, vocal activity, support vector machine classification, different activity, active speaker, high definition, casual discussion, classification problem, explored setting, voice activity detection, support vector machines, machine learning, support vector machine
Field
Computer vision, High definition, Gaze, Voice activity detection, Computer science, Support vector machine, Speech recognition, Ranging, Artificial intelligence, Casual, Low delay, Video mediated communication
DocType
Conference
Citations
3
PageRank
0.47
References
7
Authors
3
Name             Order  Citations  PageRank
Michal Hradis    1      132        14.19
Shahram Eivazi   2      33         5.31
Roman Bednarik   3      561        48.77