Title
Determining Iconic Gesture Forms based on Entity Image Representation
Abstract
Iconic gestures depict the physical objects mentioned in speech, and the form of a gesture is assumed to be based on the image of the given object in the speaker’s mind. Building on this idea, this study proposes a model that learns iconic gesture forms from an image representation derived from pictures of physical entities. First, we collect a set of pictures of each entity from the web and create an average image representation from them. This average image representation is then fed to a fully connected neural network that determines the gesture form. In a model evaluation experiment, our two-step gesture form selection method classified seven types of gesture forms with over 62% accuracy. Furthermore, we demonstrate gesture generation in a virtual agent system in which our model is used to create a gesture dictionary that assigns a gesture form to each entry word.
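The pipeline described in the abstract (average the image representations of an entity's web pictures, then classify the result with a fully connected network into one of seven gesture forms) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimension, layer sizes, and random weights are assumptions, and in practice the picture features would come from a pretrained CNN and the network weights from training.

```python
import numpy as np

N_GESTURE_FORMS = 7   # the paper classifies seven gesture forms
FEATURE_DIM = 512     # assumed size of one picture's feature vector

def average_image_representation(picture_features: np.ndarray) -> np.ndarray:
    """Average the feature vectors of all web pictures of one entity."""
    return picture_features.mean(axis=0)

def fc_classify(avg_repr: np.ndarray, w1, b1, w2, b2) -> np.ndarray:
    """Two-layer fully connected network with a softmax over gesture forms."""
    hidden = np.maximum(0.0, avg_repr @ w1 + b1)   # ReLU hidden layer
    logits = hidden @ w2 + b2
    exp = np.exp(logits - logits.max())            # numerically stable softmax
    return exp / exp.sum()

# Toy usage with random weights (a trained model would learn these).
rng = np.random.default_rng(0)
pictures = rng.normal(size=(20, FEATURE_DIM))      # 20 pictures of one entity
avg = average_image_representation(pictures)
w1, b1 = rng.normal(size=(FEATURE_DIM, 64)) * 0.01, np.zeros(64)
w2, b2 = rng.normal(size=(64, N_GESTURE_FORMS)) * 0.01, np.zeros(N_GESTURE_FORMS)
probs = fc_classify(avg, w1, b1, w2, b2)
predicted_form = int(np.argmax(probs))             # index of the chosen gesture form
```

The averaging step gives a single fixed-size vector per entity regardless of how many pictures were retrieved, which is what makes a plain fully connected classifier applicable.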
Year
2019
DOI
10.1145/3340555.3353736
Venue
ICMI
Keywords
Gesture generation, Iconic gesture, Image representation, Deep neural network
Field
Computer vision, Gesture, Computer science, Image representation, Human–computer interaction, Artificial intelligence
DocType
Conference
ISBN
978-1-4503-6860-5
Citations
0
PageRank
0.34
References
0
Authors
4
Name                    Order  Citations  PageRank
Fumio Nihei             1      18         4.52
Yukiko Nakano           2      501        62.37
Ryuichiro Higashinaka   3      341        47.27
Ryo Ishii               4      155        16.59