Abstract |
---|
In automatic speech recognition, Linear Discriminant Analysis (LDA) is a common method to decorrelate features and to reduce feature space dimensionality. In this paper, the performance of LDA is compared with other linear feature space transformation schemes, since many alternative methods have been suggested and lead to higher recognition accuracy in some cases. Different approaches such as MLLT, HLDA, SHLDA, PCA, and combined schemes were implemented and compared. Experiments show that all methods lead to similar results. In addition, recent research has shown that the LDA algorithm is unreliable if its input features are strongly correlated. In this paper a stable solution to the correlated feature problem, consisting of a concatenation scheme of PCA and LDA, is proposed and verified. Finally, several transformation algorithms are evaluated on uncorrelated and strongly correlated features. |
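The abstract's remedy for correlated input features is a concatenation of PCA and LDA: PCA first decorrelates the features and discards (near-)degenerate directions that make LDA's within-class scatter matrix ill-conditioned, and LDA then reduces the result to at most `n_classes - 1` discriminant dimensions. Below is a minimal NumPy sketch of this general idea only; the data, dimensions, and component counts are made-up illustrations, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_per_class = 3, 50
n = n_classes * n_per_class

# Strongly correlated 20-dim features generated from a 5-dim latent
# source (so the raw feature covariance is rank-deficient), plus a
# class-dependent shift so there is something for LDA to discriminate.
latent = rng.normal(size=(n, 5))
X = latent @ rng.normal(size=(5, 20))
y = np.repeat(np.arange(n_classes), n_per_class)
X = X + y[:, None]

# --- Stage 1, PCA: decorrelate and keep the informative directions ---
Xcent = X - X.mean(axis=0)
cov = Xcent.T @ Xcent / (n - 1)
evals, evecs = np.linalg.eigh(cov)          # ascending eigenvalues
top = np.argsort(evals)[::-1][:5]           # keep 5 leading components
Xp = Xcent @ evecs[:, top]

# --- Stage 2, LDA on the PCA output: maximize between-class scatter
# relative to within-class scatter ---
mean_all = Xp.mean(axis=0)
Sw = np.zeros((5, 5))
Sb = np.zeros((5, 5))
for c in range(n_classes):
    Xc_cls = Xp[y == c]
    mu = Xc_cls.mean(axis=0)
    Sw += (Xc_cls - mu).T @ (Xc_cls - mu)
    d = (mu - mean_all)[:, None]
    Sb += len(Xc_cls) * (d @ d.T)

# Eigenvectors of Sw^{-1} Sb give the discriminant directions;
# only n_classes - 1 = 2 of them are informative.
w_evals, w_evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
proj = w_evecs[:, np.argsort(w_evals.real)[::-1][:2]].real
Z = Xp @ proj
print(Z.shape)
```

Applying LDA directly to the raw 20-dimensional features above would fail, because the within-class scatter matrix of rank-deficient, correlated data is singular; the PCA stage is what makes the LDA stage well-posed.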
Year | DOI | Venue |
---|---|---|
2008 | 10.1007/978-3-540-69369-7_20 | PIT |
Keywords | Field | DocType
---|---|---|
input feature,linear discriminant analysis,automatic speech recognition,linear feature space transformation,lda algorithm,correlated feature,transformation algorithm,higher recognition accuracy,correlated features,linear feature space transformations,feature space dimensionality,correlated feature problem,feature space | Feature vector,Pattern recognition,Computer science,Uncorrelated,Speech recognition,Curse of dimensionality,Artificial intelligence,Concatenation,Linear discriminant analysis | Conference
Volume | ISSN | Citations
---|---|---|
5078 | 0302-9743 | 0
PageRank | References | Authors
---|---|---|
0.34 | 4 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Daniel Vásquez | 1 | 0 | 0.34 |
Rainer Gruhn | 2 | 45 | 6.86 |
Raymond Brueckner | 3 | 29 | 3.68 |
Wolfgang Minker | 4 | 619 | 108.61 |