Abstract |
---|
Most state-of-the-art subspace clustering methods only work with linear (or affine) subspaces. In this paper, we present a kernel subspace clustering method that can handle non-linear models. While an arbitrary kernel can non-linearly map data into a high-dimensional Hilbert feature space, the data in the resulting feature space are very unlikely to have the desired subspace structures. By contrast, we propose to learn a low-rank kernel mapping with which the mapped data in feature space are not only low-rank but also self-expressive, so that low-dimensional subspace structures are preserved and manifested in the high-dimensional feature space. We have evaluated the proposed method extensively on both motion segmentation and image clustering benchmarks and obtained superior results, outperforming the kernel subspace clustering method that uses standard kernels [patel2014kernel] and other state-of-the-art linear subspace clustering methods. |
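The self-expressiveness property the abstract relies on — each point being written as a combination of other points from the same subspace — can be computed entirely from a kernel matrix. Below is a minimal sketch (not the paper's learned low-rank kernel; it uses a fixed linear kernel and a simple ridge-regularized self-expression, with all names and the toy data being illustrative assumptions):

```python
import numpy as np

def kernel_self_expression(K, lam=0.1):
    """Closed-form ridge self-expression in feature space.

    Solves min_C ||Phi(X) - Phi(X) C||_F^2 + lam * ||C||_F^2
    purely through the kernel matrix K = Phi(X)^T Phi(X),
    giving C = (K + lam * I)^{-1} K.
    """
    n = K.shape[0]
    C = np.linalg.solve(K + lam * np.eye(n), K)
    # Symmetrized affinity, as typically fed to spectral clustering
    return np.abs(C) + np.abs(C.T)

# Toy data: 5 points on each of two orthogonal lines (1-D subspaces) in R^3
rng = np.random.default_rng(0)
d1 = np.array([1.0, 0.0, 0.0])
d2 = np.array([0.0, 1.0, 0.0])
X = np.vstack([rng.uniform(1, 2, 5)[:, None] * d1,
               rng.uniform(1, 2, 5)[:, None] * d2])  # shape (10, 3)

K = X @ X.T                      # linear kernel matrix
A = kernel_self_expression(K)    # (10, 10) affinity

# Each point should express itself via points from its own subspace,
# so the affinity mass concentrates in the two diagonal blocks.
within = A[:5, :5].sum() + A[5:, 5:].sum()
across = A[:5, 5:].sum() + A[5:, :5].sum()
print(within > across)  # prints: True
```

The final clustering step (spectral clustering on `A`) is omitted; the point is only that the block-diagonal affinity structure — the "desired subspace structure" the abstract refers to — emerges when the kernel respects the underlying subspaces.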
Year | Venue | Field
---|---|---
2017 | arXiv: Computer Vision and Pattern Recognition | Kernel (linear algebra), Pattern recognition, Correlation clustering, Kernel embedding of distributions, Random subspace method, Kernel principal component analysis, FLAME clustering, Artificial intelligence, Hyperplane, Cluster analysis, Mathematics

DocType | Volume | Citations
---|---|---
Journal | abs/1707.04974 | 2

PageRank | References | Authors
---|---|---
0.36 | 2 | 5
Name | Order | Citations | PageRank |
---|---|---|---
Pan Ji | 1 | 2 | 0.70 |
Ian Reid | 2 | 3 | 2.41 |
Ravi Garg | 3 | 2 | 0.36 |
Hongdong Li | 4 | 1724 | 101.81 |
Mathieu Salzmann | 5 | 1578 | 88.48 |