Title
Manual evaluation of synthesised sign language avatars
Abstract
The evaluation discussed in this paper explores the role that underlying facial expressions might play in the understandability of sign language avatars. Focusing specifically on Irish Sign Language (ISL), we examine the Deaf community's appetite for sign language avatars. The work presented explores the following hypothesis: augmenting an existing avatar with various combinations of the seven widely accepted universal emotions identified by Ekman [1] to achieve underlying facial expressions will make that avatar more human-like and consequently improve usability and understandability for the ISL user. Using human evaluation methods [2], we compare an augmented set of avatar utterances against a baseline set, focusing on two key areas: comprehension and naturalness of facial configuration. We outline our approach to the evaluation, including our choice of ISL participants, interview environment and evaluation methodology.
Year
2013
DOI
10.1145/2513383.2513420
Venue
ASSETS
Keywords
facial configuration, ISL participant, synthesised sign language avatar, manual evaluation, ISL user, evaluation methodology, avatar utterance, sign language avatar, facial expression, existing avatar, underlying facial expression, human evaluation method, emotion, user centered design, accessibility, sign language, HCI
Field
Computer science, Deaf community, Usability, Naturalness, Facial expression, Human–computer interaction, Sign language, Avatar, Multimedia, Comprehension, User-centered design
DocType
Conference
Citations
1
PageRank
0.39
References
1
Authors
2
Name            Order    Citations    PageRank
Robert Smith    1        173          23.32
Brian Nolan     2        10           3.19