Title
Nonlinear Unsupervised Feature Learning: How Local Similarities Lead to Global Coding
Abstract
This paper introduces a novel coding scheme based on the diffusion map framework. The idea is to run a t-step random walk on the data graph to capture the similarity of a data point to the codebook atoms. In this way, local similarities extracted from the data structure are combined into a global similarity that accounts for the nonlinear structure of the data. Unlike locality-based and sparse coding methods, the proposed coding varies smoothly along the underlying manifold. We extend this transductive approach to an inductive variant, which is of great interest for large-scale datasets. We also present a method for codebook generation that coarse-grains the data graph so as to preserve random walks. Experiments on synthetic and real datasets demonstrate the superiority of the proposed coding scheme over state-of-the-art coding techniques, especially in semi-supervised settings where the number of labeled samples is small.
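The t-step random walk coding described in the abstract can be sketched as follows. This is a minimal illustration under assumptions not stated in the abstract (a Gaussian-kernel affinity graph over the data points and atoms, a toy dataset, and t=3), not the authors' implementation:

```python
import numpy as np

def random_walk_code(X, atoms, t=3, sigma=1.0):
    """Code each point in X by its t-step random-walk probabilities
    of landing on the codebook atoms (a diffusion-map-style sketch)."""
    # Build an affinity graph over data points plus atoms (Gaussian kernel).
    nodes = np.vstack([X, atoms])
    d2 = ((nodes[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    Pt = np.linalg.matrix_power(P, t)      # t-step transition probabilities
    # Code for each data point: probability of reaching each atom in t steps.
    codes = Pt[: len(X), len(X):]
    return codes / codes.sum(axis=1, keepdims=True)  # normalize to a distribution

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 2))       # hypothetical data points
atoms = rng.normal(size=(4, 2))    # hypothetical codebook atoms
C = random_walk_code(X, atoms, t=3)
print(C.shape)  # (10, 4)
```

Because the codes are t-step diffusion probabilities over the whole graph rather than distances to the nearest atoms, nearby points on the manifold receive similar codes, matching the smoothness claim in the abstract.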
Year
2012
DOI
10.1109/ICDMW.2012.86
Venue
Data Mining Workshops
Keywords
data graph, sparse coding method, data point, state-of-the-art coding technique, codebook generation, proposed coding scheme, data structure, nonlinear unsupervised feature learning, codebook atom, global coding, proposed coding, local similarities, encoding, data structures, graph theory, unsupervised learning
Field
Graph theory, Transduction (machine learning), Data structure, Pattern recognition, Computer science, Neural coding, Coding (social sciences), Unsupervised learning, Artificial intelligence, Machine learning, Feature learning, Codebook
DocType
Conference
ISSN
2375-9232
ISBN
978-1-4673-5164-5
Citations
0
PageRank
0.34
References
8
Authors
4
Name               Order  Citations  PageRank
Amirreza Shaban    1      48         5.60
Hamid R. Rabiee    2      3364       1.77
Marzieh S. Tahaei  3      0          0.34
Erfan Salavati     4      0          0.34