Abstract |
---|
Manifold learning performs nonlinear dimensionality reduction on high-dimensional data. ISOMAP, LLE, Laplacian Eigenmaps, LTSA, and multilayer autoencoders are representative algorithms. Most of them are defined only on the training set and run in batch mode; they provide no model or formula to map new data into the low-dimensional space. In this paper, we propose an incremental manifold learning algorithm based on the small-world model, which generalizes ISOMAP to new samples. First, the k nearest neighbors and some faraway points are selected from the training set for each new sample. Then the low-dimensional embedding of the new sample is obtained by preserving the geodesic distances between it and those points. Experiments demonstrate that the presented method effectively projects new samples into the low-dimensional space with low computational complexity. |
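The out-of-sample step described in the abstract — placing a new point so that its geodesic distances to the selected anchors (k nearest neighbors plus a few faraway points) are preserved — can be sketched as a small least-squares problem. This is a minimal illustration under stated assumptions, not the paper's implementation; the function name and the choice of a generic nonlinear least-squares solver are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def embed_new_sample(anchors_low, geo_dists):
    """Place a new sample in the low-dimensional space by preserving
    its geodesic distances to a set of anchor points.

    anchors_low : (m, d) low-dimensional coordinates of the anchors
                  (k nearest neighbors plus some faraway points)
    geo_dists   : (m,)   geodesic distances from the new sample
                  to those anchors, measured in the original space
    """
    def residuals(y):
        # Mismatch between embedded distances and target geodesic distances
        return np.linalg.norm(anchors_low - y, axis=1) - geo_dists

    y0 = anchors_low.mean(axis=0)  # start at the anchor centroid
    return least_squares(residuals, y0).x
```

With enough well-spread anchors the problem is overdetermined, so the recovered coordinates are stable; the faraway points mentioned in the abstract serve exactly this role, preventing the placement from being fixed only up to local ambiguity.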
Year | DOI | Venue |
---|---|---|
2010 | 10.1007/978-3-642-15621-2_36 | LSMS/ICSEE (1) |
Keywords | Field | DocType |
---|---|---|
representative algorithm,low dimensional,manifold learning,training set,new sample,small world model,high dimensional space,new data,incremental manifold,low dimensional space,geodesic distance,nonlinear dimensionality reduction,k nearest neighbor | k-nearest neighbors algorithm,Embedding,Dimensionality reduction,Pattern recognition,Algorithm,Manifold alignment,Artificial intelligence,Nonlinear dimensionality reduction,Geodesic,Mathematics,Isomap,Laplace operator | Conference |
Volume | ISSN | ISBN |
---|---|---|
6328 | 0302-9743 | 3-642-15620-7 |
Citations | PageRank | References |
---|---|---|
3 | 0.47 | 5 |
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Lukui Shi | 1 | 11 | 4.33 |
Qingxin Yang | 2 | 4 | 2.84 |
Enhai Liu | 3 | 10 | 2.64 |
Li Jianwei | 4 | 114 | 17.71 |
Yongfeng Dong | 5 | 8 | 1.02 |