Abstract |
---|
Incremental learning allows new data to be incorporated into a classifier model without full retraining, improving computational efficiency. In this paper, we present two ways of performing incremental learning on Grassmann manifolds. In a Grassmann kernel learning framework, data are embedded as subspaces and kernels are constructed to map data subspaces to a projection space for classification. As new data samples become available, full retraining degrades computational performance, since Grassmann kernels must be recomputed on increasingly large matrices. We propose two computationally efficient techniques for incremental Grassmann kernel learning that achieve linear time complexity. We utilize the GROUSE framework to embed new data onto a pre-existing Grassmann manifold using Incremental Singular Value Decomposition (iSVD). We then map the embeddings from the Grassmann space onto a projection space by exploiting the positive definite structure of Grassmann kernels and solving for the principal angles of modified subspace pairs (iKernel). We show that our incremental learning approach is effective in large systems and demonstrate it on face recognition with standard datasets. |
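The abstract builds on the standard Grassmann projection kernel, which is defined through the principal angles between two subspaces. The sketch below, a minimal illustration rather than the paper's iKernel or GROUSE implementation (neither is specified in the abstract), shows how principal angles and the projection kernel k(U1, U2) = ||U1^T U2||_F^2 = Σ cos²(θ_i) are computed from orthonormal bases; all function names are illustrative.

```python
import numpy as np

def subspace_basis(X, k):
    # Orthonormal basis (via thin SVD) of the k-dimensional
    # subspace spanned by the columns of X.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

def principal_angles(U1, U2):
    # The singular values of U1^T U2 are the cosines of the
    # principal angles between the two subspaces.
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

def projection_kernel(U1, U2):
    # Grassmann projection kernel: k(U1, U2) = ||U1^T U2||_F^2,
    # i.e. the sum of squared cosines of the principal angles.
    # This kernel is positive definite on the Grassmannian.
    return np.linalg.norm(U1.T @ U2, "fro") ** 2
```

Because the kernel depends on the subspace pair only through its principal angles, updating a subspace basis incrementally (e.g. via iSVD) lets kernel entries be refreshed without rebuilding the full Gram matrix from scratch, which is the efficiency the abstract targets.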
Year | DOI | Venue |
---|---|---|
2017 | 10.1109/MWSCAS.2017.8053213 | Midwest Symposium on Circuits and Systems Conference Proceedings |
Keywords | Field | DocType |
---|---|---|
Grassmann manifolds, incremental learning, incremental kernel PCA, face recognition | Computer science, Control engineering, Artificial intelligence, Time complexity, Manifold, Kernel (linear algebra), Singular value decomposition, Facial recognition system, Subspace topology, Algorithm, Linear subspace, Grassmannian, Machine learning | Conference |
ISSN | Citations | PageRank |
---|---|---|
1548-3746 | 0 | 0.34 |
References | Authors |
---|---|
12 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Sherif Azary | 1 | 23 | 3.45 |
Andreas Savakis | 2 | 377 | 41.10 |