Title
Parametric nonlinear dimensionality reduction using kernel t-SNE.
Abstract
Novel non-parametric dimensionality reduction techniques such as t-distributed stochastic neighbor embedding (t-SNE) lead to a powerful and flexible visualization of high-dimensional data. One drawback of non-parametric techniques is their lack of an explicit out-of-sample extension. In this contribution, we propose an efficient extension of t-SNE to a parametric framework, kernel t-SNE, which preserves the flexibility of basic t-SNE but enables explicit out-of-sample extensions. We test the ability of kernel t-SNE in comparison to standard t-SNE on benchmark data sets, in particular addressing the generalization ability of the mapping to novel data. In the context of large data sets, this procedure enables us to train a mapping on a fixed-size subset only, mapping all remaining data afterwards in linear time. We demonstrate that this technique yields satisfactory results also for large data sets, provided that information missing due to the small size of the subset is accounted for by auxiliary information such as class labels, which can be integrated into kernel t-SNE based on the Fisher information.
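The abstract describes an out-of-sample extension in which a kernel mapping, trained on a fixed-size subset embedded by standard t-SNE, projects novel points in linear time. The following is a minimal sketch of such a mapping, assuming a normalized Gaussian kernel with a median-distance bandwidth and coefficients fitted by least squares; the function names, the bandwidth heuristic, and the use of scikit-learn's TSNE are illustrative assumptions, not the paper's reference implementation.

```python
# Sketch of a kernel-based out-of-sample mapping in the spirit of kernel t-SNE.
# Assumptions (not from the paper's code): Gaussian kernel, median-distance
# bandwidth, least-squares fit of the mapping coefficients via pseudo-inverse.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.metrics import pairwise_distances

def fit_kernel_tsne(X_train, sigma=None, random_state=0):
    # Embed the fixed-size training subset with standard t-SNE.
    Y_train = TSNE(n_components=2, random_state=random_state).fit_transform(X_train)
    D = pairwise_distances(X_train, X_train)
    if sigma is None:
        sigma = np.median(D)                    # heuristic bandwidth choice
    K = np.exp(-D**2 / (2.0 * sigma**2))
    K /= K.sum(axis=1, keepdims=True)           # normalized kernel rows
    A = np.linalg.pinv(K) @ Y_train             # coefficients by least squares
    return A, sigma, X_train

def map_kernel_tsne(X_new, A, sigma, X_train):
    # Project novel points; cost is linear in the number of new points.
    D = pairwise_distances(X_new, X_train)
    K = np.exp(-D**2 / (2.0 * sigma**2))
    K /= K.sum(axis=1, keepdims=True)
    return K @ A
```

In this sketch the expensive non-parametric embedding is computed only for the subset, and all further points are mapped by the fixed kernel regression, which is what makes the out-of-sample extension explicit and cheap.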
Year
2015
DOI
10.1016/j.neucom.2013.11.045
Venue
Neurocomputing
Keywords
t-SNE, Dimensionality reduction, Visualization, Fisher information, Out-of-sample extension
Field
Kernel (linear algebra), Embedding, Dimensionality reduction, Pattern recognition, Kernel embedding of distributions, Parametric statistics, Fisher information, Artificial intelligence, Nonlinear dimensionality reduction, Fisher kernel, Machine learning, Mathematics
DocType
Journal
Volume
147
ISSN
0925-2312
Citations
21
PageRank
0.74
References
21
Authors
3
Name                Order    Citations    PageRank
Andrej Gisbrecht    1        195          15.60
Alexander Schulz    2        46           8.34
Barbara Hammer      3        2383         181.34