Title
Are Real Tongue Movements Easier To Speech Read Than Synthesized?
Abstract
Speech perception studies with augmented reality displays in talking heads have shown that tongue reading abilities are weak initially, but that subjects become able to extract some information from intra-oral visualizations after a short training session. In this study, we investigate how the nature of the tongue movements influences the results, by comparing synthetic rule-based and actual, measured movements. The subjects were significantly better at perceiving sentences accompanied by real movements, indicating that the current coarticulation model developed for facial movements is not optimal for the tongue.
Year
2009
Venue
INTERSPEECH 2009: 10th Annual Conference of the International Speech Communication Association 2009, Vols 1-5
Keywords
multimodal speech perception, augmented reality, visual speech synthesis
Field
Computer science, Communication studies, Augmented reality, Speech recognition, Coarticulation, Speech perception, Tongue
DocType
Conference
Citations
0
PageRank
0.34
References
6
Authors
2
Name           Order   Citations   PageRank
Olov Engwall   1       197         30.71
Preben Wik     2       85          11.72