Title |
---|
Convergence and Recovery Guarantees of the K-Subspaces Method for Subspace Clustering |
Abstract |
---|
The K-subspaces (KSS) method is a generalization of the K-means method for subspace clustering. In this work, we present a local convergence analysis and a recovery guarantee for KSS, assuming data are generated by the semi-random union-of-subspaces model, where $N$ points are randomly sampled from $K \ge 2$ overlapping subspaces. We show that if the initial assignment of the KSS method lies within a neighborhood of a true clustering, it converges at a superlinear rate and finds the correct clustering within $\Theta(\log\log N)$ iterations with high probability. Moreover, we propose a thresholding inner-product-based spectral method for initialization and prove that it produces a point in this neighborhood. We also present numerical results to support our theoretical developments. |
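The abstract describes the KSS alternating scheme, which generalizes the Lloyd iterations of K-means: assign each point to its nearest subspace, then refit each subspace to its assigned points via PCA. Below is a minimal NumPy sketch of that loop, not the paper's exact implementation; it uses a naive random initialization in place of the thresholding inner-product spectral method, and all function and variable names are illustrative.

```python
import numpy as np

def k_subspaces(X, K, d, n_iters=20, seed=0):
    """Sketch of the K-subspaces (KSS) alternating scheme.
    X: (n_points, ambient_dim) data; fit K subspaces of dimension d."""
    rng = np.random.default_rng(seed)
    n, D = X.shape
    # Naive initialization: random orthonormal bases (the paper instead
    # proposes a thresholding inner-product based spectral method).
    bases = [np.linalg.qr(rng.standard_normal((D, d)))[0] for _ in range(K)]
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iters):
        # Assignment step: distance from each point to each subspace,
        # measured by the norm of the residual after projection.
        dists = np.stack(
            [np.linalg.norm(X - (X @ U) @ U.T, axis=1) for U in bases],
            axis=1,
        )
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stabilized
        labels = new_labels
        # Update step: refit each basis to the top-d principal
        # directions of its cluster's points.
        for k in range(K):
            pts = X[labels == k]
            if len(pts) >= d:
                U, _, _ = np.linalg.svd(pts.T, full_matrices=False)
                bases[k] = U[:, :d]
    return labels, bases
```

The superlinear convergence result in the abstract concerns this alternating loop once the assignment lies in a neighborhood of a true clustering; with the random initialization above, no such guarantee applies.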
Year | Venue | DocType |
---|---|---|
2022 | International Conference on Machine Learning | Conference |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 0
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Peng Wang | 1 | 0 | 0.68 |
Huikang Liu | 2 | 5 | 2.11
Anthony Man-Cho So | 3 | 1821 | 99.32
Laura Balzano | 4 | 19 | 3.43 |