Title
Non-Parametric Bayesian Subspace Models for Acoustic Unit Discovery
Abstract
This work investigates non-parametric subspace models for learning a set of acoustic units from unlabeled speech recordings. We constrain the base measure of a Dirichlet process mixture with a phonetic subspace, estimated from other source languages, to build an educated prior, thereby forcing the learned acoustic units to resemble phones of the known source languages. Two types of models are proposed: (i) the Subspace HMM (SHMM), which assumes that the phonetic subspace is the same for every language, and (ii) the Hierarchical-Subspace HMM (H-SHMM), which relaxes this assumption and allows a language-specific subspace to be estimated on the unlabeled target data. These models are applied to three languages: English, Yoruba and Mboshi, and they are compared with several competitive acoustic unit discovery baselines. Experimental results show that both subspace models outperform the other systems in terms of clustering quality and segmentation accuracy. Moreover, the H-SHMM yields results superior to the SHMM, supporting the idea that language-specific priors are preferable to language-agnostic priors for acoustic unit discovery.
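The sketch below (Python/NumPy) illustrates the idea summarized in the abstract: per-unit HMM parameters are not free but are generated from a low-dimensional phonetic subspace, and in the hierarchical variant the subspace itself is built from a language-independent hyper-subspace plus a language embedding inferred from the target data. All names, dimensions and link functions (exponential link, single Gaussian per state) are assumptions made for brevity; this is not the authors' exact parameterization or inference procedure.

```python
# Minimal sketch of subspace-constrained acoustic-unit HMMs (assumptions throughout).
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 39        # e.g. MFCC features (assumed dimensionality)
N_STATES = 3         # left-to-right HMM states per acoustic unit
PHONETIC_DIM = 10    # dimension of the per-unit embedding (assumed)

# Subspace parameters (W, b): in the SHMM these would be estimated from
# labeled source languages and then kept fixed for the target language.
param_dim = N_STATES * 2 * FEAT_DIM          # per-state mean and log-variance
W = rng.normal(scale=0.1, size=(param_dim, PHONETIC_DIM))
b = rng.normal(scale=0.1, size=param_dim)

def unit_hmm_params(h):
    """Map a unit embedding h onto HMM emission parameters.

    The subspace forces every discovered unit to live on a low-dimensional
    manifold of phone-like HMMs instead of the full parameter space.
    """
    eta = W @ h + b
    eta = eta.reshape(N_STATES, 2, FEAT_DIM)
    means = eta[:, 0, :]                      # per-state Gaussian means
    variances = np.exp(eta[:, 1, :])          # positivity via an exp link
    return means, variances

# H-SHMM variant: the subspace is language dependent, expressed through a
# hyper-subspace, e.g. W_lang = W0 + sum_k alpha_k * W_k, where the small
# vector alpha is inferred from the unlabeled target-language data.
W0 = rng.normal(scale=0.1, size=(param_dim, PHONETIC_DIM))
W_basis = rng.normal(scale=0.01, size=(5, param_dim, PHONETIC_DIM))
alpha = rng.normal(size=5)                    # language embedding (assumed dim)
W_lang = W0 + np.tensordot(alpha, W_basis, axes=1)

# Each discovered acoustic unit is then a point h_u in the phonetic subspace.
h_u = rng.normal(size=PHONETIC_DIM)
means_u, vars_u = unit_hmm_params(h_u)
print(means_u.shape, vars_u.shape)            # (3, 39) (3, 39)
```

Under this construction, the Dirichlet process mixture only has to place its units in the low-dimensional embedding space, which is how the subspace acts as an educated prior over what a discovered unit can look like.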
Year
2022
DOI
10.1109/TASLP.2022.3171975
Venue
IEEE/ACM TRANSACTIONS ON AUDIO, SPEECH, AND LANGUAGE PROCESSING
Keywords
Hidden Markov models, Acoustics, Data models, Bayes methods, Speech recognition, Speech processing, Task analysis, Unsupervised learning, non-parametric Bayesian models, acoustic unit discovery
DocType
Journal
Volume
30
ISSN
2329-9290
Citations
0
PageRank
0.34
References
0
Authors
4
Name            Order  Citations  PageRank
Lucas Ondel     1      35         7.16
Bolaji Yusuf    2      1          3.72
Lukas Burget    3      0          0.34
Murat Saraclar  4      669        62.91