Title
Megaman: Scalable Manifold Learning in Python.
Abstract
Manifold Learning (ML) is a class of algorithms seeking a low-dimensional, non-linear representation of high-dimensional data. Thus, ML algorithms are most applicable to high-dimensional data and require large sample sizes to accurately estimate the manifold. Despite this, most existing manifold learning implementations are not particularly scalable. Here we present a Python package that implements a variety of manifold learning algorithms in a modular and scalable fashion, using fast approximate nearest-neighbor searches and fast sparse eigendecompositions. The package incorporates theoretical advances in manifold learning, such as the unbiased Laplacian estimator introduced by Coifman and Lafon (2006) and the estimation of the embedding distortion by the Riemannian metric method introduced by Perrault-Joncas and Meila (2013). In benchmarks, even on a single-core desktop computer, our code embeds millions of data points in minutes, and takes just 200 minutes to embed the main sample of galaxy spectra from the Sloan Digital Sky Survey, consisting of 0.6 million samples in 3750 dimensions, a task which has not previously been possible.
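The abstract describes a modular, scikit-learn-style design in which the data-dependent graph quantities (neighbors, affinities, Laplacian) are gathered in a geometry object and reused by the embedding classes. The snippet below is a minimal sketch of that usage pattern, based on the package documentation; the keyword names (adjacency_method='cyflann', laplacian_method='geometric', eigen_solver='amg', etc.), the radius value, and the random stand-in data are assumptions for illustration, not a definitive recipe from the paper.

    import numpy as np
    from megaman.geometry import Geometry
    from megaman.embedding import SpectralEmbedding

    X = np.random.random((10000, 50))  # stand-in for high-dimensional data
    radius = 1.0                       # neighborhood radius (assumed value)

    # Collect graph quantities once so they can be shared across embeddings;
    # 'geometric' is assumed here to select the renormalized (Coifman-Lafon) Laplacian.
    geom = Geometry(adjacency_method='cyflann', adjacency_kwds={'radius': radius},
                    affinity_method='gaussian', affinity_kwds={'radius': radius},
                    laplacian_method='geometric', laplacian_kwds={'scaling_epps': radius})

    # Spectral embedding using a sparse eigensolver (AMG preconditioning assumed available).
    se = SpectralEmbedding(n_components=2, eigen_solver='amg', geom=geom)
    X_embedded = se.fit_transform(X)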
Year
2016
Venue
Journal of Machine Learning Research
Keywords
manifold learning, dimension reduction, Riemannian metric, graph embedding, scalable methods, python
Field
Data point, Embedding, Dimensionality reduction, Graph embedding, Computer science, Theoretical computer science, Manifold alignment, Artificial intelligence, Nonlinear dimensionality reduction, Machine learning, Python (programming language), Manifold
DocType
Journal
Volume
17
ISSN
1532-4435
Citations
1
PageRank
0.35
References
0
Authors
4
Name               Order   Citations   PageRank
James M. McQueen   1       59          10.02
Marina Meila       2       1809        213.25
Jake Vanderplas    3       3979        167.73
Zhongyue Zhang     4       1           0.35