Abstract |
---|
We present a probabilistic subspace clustering approach that is capable of rapidly clustering very large signal collections. Each signal is represented by a sparse combination of basis elements (atoms), which form the columns of a dictionary matrix. The set of sparse representations is used to derive the co-occurrence matrix of atoms and signals, which is modeled as emerging from a mixture model. The components of the mixture model are obtained via a non-negative matrix factorization (NNMF) of the co-occurrence matrix, and the subspace of each signal is estimated according to a maximum-likelihood (ML) criterion. Performance evaluation demonstrates clustering accuracy comparable to the state of the art at a fraction of the computational load. |
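The pipeline described in the abstract (sparse codes → atom/signal co-occurrence matrix → NNMF → per-signal cluster assignment) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the multiplicative-update NMF and the argmax assignment stand in for the paper's mixture-model estimation and ML criterion, and all sizes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy co-occurrence matrix C (atoms x signals): C[i, j] records how strongly
# atom i participates in the sparse representation of signal j.
# Two groups of signals are synthesized over disjoint atom subsets,
# mimicking signals drawn from two different subspaces.
n_atoms, n_signals, n_clusters = 20, 40, 2
C = np.zeros((n_atoms, n_signals))
C[:10, :20] = rng.random((10, 20))   # cluster 1 uses atoms 0..9
C[10:, 20:] = rng.random((10, 20))   # cluster 2 uses atoms 10..19

# Plain multiplicative-update NMF: C ~ W @ H with W >= 0, H >= 0.
# Each column of W is an atom-usage profile of one mixture component;
# each column of H weights the components for one signal.
W = rng.random((n_atoms, n_clusters)) + 1e-3
H = rng.random((n_clusters, n_signals)) + 1e-3
for _ in range(200):
    H *= (W.T @ C) / (W.T @ W @ H + 1e-12)
    W *= (C @ H.T) / (W @ H @ H.T + 1e-12)

# Assign each signal to the component with the largest weight in H
# (a simple stand-in for the paper's maximum-likelihood assignment).
labels = H.argmax(axis=0)
```

Because the factorization only touches the (small) co-occurrence matrix rather than the raw signals, the clustering step stays cheap even for very large collections, which is the source of the speed advantage the abstract claims.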
Year | DOI | Venue |
---|---|---|
2013 | 10.1109/LSP.2012.2229705 | IEEE Signal Process. Lett. |
Keywords | Field | DocType
---|---|---
signal representation,pattern clustering,signal estimation,dictionary,sparse matrices,maximum likelihood estimation,signal collections,basis elements,probabilistic subspace clustering,nnmf,dictionary matrix,computational load,subspace clustering,cooccurrences matrix,nonnegative matrix factorization,matrix decomposition,ml criterion,performance evaluation,non-negative matrix factorization,clustering accuracy,mixture model,aspect model,sparse combination,sparse representation,maximum-likelihood criterion,probability,sparse representations | K-SVD,Correlation clustering,Pattern recognition,Sparse approximation,Matrix decomposition,Non-negative matrix factorization,Artificial intelligence,Biclustering,Cluster analysis,Sparse matrix,Mathematics | Journal
Volume | Issue | ISSN
---|---|---
20 | 1 | 1070-9908
Citations | PageRank | References
---|---|---
17 | 0.66 | 9
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Amir Adler | 1 | 96 | 8.81 |
Michael Elad | 2 | 11274 | 854.93 |
Yacov Hel-Or | 3 | 461 | 40.74 |