Title
Scaling Locally Linear Embedding.
Abstract
Locally Linear Embedding (LLE) is a popular approach to dimensionality reduction because it can effectively represent nonlinear structures in high-dimensional data. To reduce dimensionality, it builds a nearest neighbor graph of the given dataset, computes edge weights with the Lagrange multiplier method, and then computes eigenvectors of the LLE kernel constructed from those edge weights. Although LLE is used in many applications, its computation cost is high: obtaining the edge weights is cubic in the number of edges incident to each data point, and obtaining the eigenvectors of the LLE kernel is cubic in the number of data points. Our approach, Ripple, is based on two ideas: (1) it incrementally updates the edge weights by exploiting the Woodbury formula, and (2) it efficiently computes eigenvectors of the LLE kernel by exploiting the LU decomposition-based inverse power method. Experiments show that Ripple is significantly faster than the original LLE approach while guaranteeing the same dimensionality reduction results.
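For context, the sketch below outlines the standard LLE pipeline that the abstract refers to: build a k-nearest-neighbor graph, obtain reconstruction weights from the Lagrange multiplier closed form, and take the bottom eigenvectors of the LLE kernel via an LU decomposition-based inverse power iteration. It is not an implementation of Ripple; the Woodbury-based incremental weight updates are omitted, and the function name lle_embed, its parameters (k, n_components, reg, shift), and the dense matrices are illustrative assumptions.

```python
# Minimal sketch of the standard LLE pipeline (NOT the Ripple algorithm).
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def lle_embed(X, k=10, n_components=2, reg=1e-3, n_iter=200, shift=1e-6):
    n = X.shape[0]

    # (1) k-nearest-neighbor graph (brute-force distances for simplicity).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nbrs = np.argsort(d2, axis=1)[:, :k]

    # (2) Reconstruction weights: minimize ||x_i - sum_j w_ij x_j||^2
    # subject to sum_j w_ij = 1 (Lagrange multiplier closed form).
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                 # neighbors centered at x_i
        G = Z @ Z.T                           # local Gram matrix
        G += reg * np.trace(G) * np.eye(k)    # regularize for stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs[i]] = w / w.sum()

    # (3) LLE kernel and its bottom eigenvectors.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)

    # Inverse power method: factorize the (slightly shifted) kernel once with
    # an LU decomposition, then repeatedly solve linear systems; the iterate is
    # pulled toward the eigenvectors with the smallest eigenvalues.
    lu, piv = lu_factor(M + shift * np.eye(n))
    rng = np.random.default_rng(0)
    basis = [np.ones(n) / np.sqrt(n)]         # discard the trivial constant mode
    vecs = []
    for _ in range(n_components):
        v = rng.standard_normal(n)
        for _ in range(n_iter):
            v = lu_solve((lu, piv), v)
            for b in basis:                   # deflate against found directions
                v -= (v @ b) * b
            v /= np.linalg.norm(v)
        basis.append(v)
        vecs.append(v)
    return np.column_stack(vecs)
```

At scale, sparse representations of W and M and a proper shift-invert eigensolver would be needed; the dense version above only illustrates the three steps named in the abstract.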
Year
2017
DOI
10.1145/3035918.3064021
Venue
SIGMOD Conference
Field
Data point, Kernel (linear algebra), Data mining, Mathematical optimization, Dimensionality reduction, Embedding, Computer science, Algorithm, Nearest neighbor graph, Woodbury matrix identity, LU decomposition, Inverse iteration
DocType
Conference
Citations
2
PageRank
0.38
References
18
Authors
7
Name               Order  Citations  PageRank
Yasuhiro Fujiwara  1      292        25.43
Naoki Marumo       2      4          0.74
Mathieu Blondel    3      4055       174.33
Koh Takeuchi       4      59         11.29
Hideaki Kim        5      49         6.39
Tomoharu Iwata     6      824        65.87
Naonori Ueda       7      1902       214.32