Title
Can you 'read' tongue movements? Evaluation of the contribution of tongue display to speech understanding
Abstract
Lip reading relies on the visible articulators to ease speech understanding. However, lips and face alone provide very incomplete phonetic information: the tongue, which is generally not visible, carries an important part of the articulatory information that lip reading cannot access. The question is thus whether direct and full vision of the tongue allows tongue reading. We therefore generated a set of audiovisual VCV stimuli with an audiovisual talking head that can display all speech articulators, including the tongue, in an augmented speech mode. The talking head is a virtual clone of a human speaker, and the articulatory movements were captured on this speaker using ElectroMagnetic Articulography (EMA). These stimuli were played to subjects in audiovisual perception tests under various presentation conditions (audio signal alone; audiovisual signal with a profile cutaway display, with or without the tongue; complete face) at various Signal-to-Noise Ratios. The results indicate: (1) the possibility of implicit learning of tongue reading; (2) better consonant identification with the cutaway presentation with the tongue than without it; (3) no significant difference between the cutaway presentation with the tongue and the more ecological rendering of the complete face; (4) a predominance of lip reading over tongue reading; but (5) a certain natural human capability for tongue reading when the audio signal is strongly degraded or absent. We conclude that these tongue reading capabilities could be used in speech therapy for children with delayed speech, in perception and production rehabilitation of hearing-impaired children, and in pronunciation training for second-language learners.
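The abstract describes degrading the audio track of the stimuli at various Signal-to-Noise Ratios. As a minimal sketch of that idea (not the authors' actual procedure; the function name, the use of additive white noise, and the test tone are assumptions for illustration), noise can be scaled so that the mixture reaches a target SNR in dB:

```python
import numpy as np

def mix_at_snr(signal, noise, snr_db):
    """Scale `noise` so the signal-to-noise ratio of the mix is `snr_db`,
    then add it to `signal`. Illustrative sketch only."""
    p_signal = np.mean(signal ** 2)                 # signal power
    p_noise = np.mean(noise ** 2)                   # noise power before scaling
    # SNR_dB = 10*log10(Ps/Pn)  =>  required noise power Pn = Ps / 10^(SNR/10)
    target_p_noise = p_signal / (10 ** (snr_db / 10))
    scaled_noise = noise * np.sqrt(target_p_noise / p_noise)
    return signal + scaled_noise

# Example: degrade a 1 kHz tone with white noise at 0 dB SNR
fs = 16000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
rng = np.random.default_rng(0)
mixed = mix_at_snr(tone, rng.standard_normal(fs), snr_db=0.0)
```

At 0 dB the noise carries the same power as the speech signal; lower (negative) SNRs, where the audio becomes strongly degraded, are the conditions under which the abstract reports tongue reading contributing most.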
Year: 2010
DOI: 10.1016/j.specom.2010.03.002
Venue: Speech Communication
Keywords: lip reading, tongue reading, cutaway presentation, complete face, tongue reading capability, audiovisual speech perception, tongue display, speech understanding, hearing losses, augmented speech, virtual audiovisual talking head, speech retarded child, augmented speech mode, speech therapy, audio signal, electromagnetic articulography (EMA), tongue movement, speech articulators, signal-to-noise ratio, vision, comprehension, sciences, identification, cued speech, statistical significance, white noise, augmented reality, production, evaluation, speech perception, face, information theory
Field: Pronunciation, Audio signal, Consonant, Computer science, Second language, Implicit learning, Speech recognition, Stimulus (physiology), Perception, Tongue
DocType: Journal
Volume: 52
Issue: 6
ISSN:
Citations: 22
PageRank: 1.71
References: 11
Authors: 4
Name               Order  Citations  PageRank
Pierre Badin       1      2323       5.83
Yuliya Tarabalka   2      9074       7.12
Frédéric Elisei    3      2752       5.05
Gérard Bailly      4      6099       9.37