Title
A Fast Non-Smooth Nonnegative Matrix Factorization for Learning Sparse Representation
Abstract
Nonnegative matrix factorization (NMF) is an active topic in machine learning and data processing. Recently, a constrained variant, non-smooth NMF (NsNMF), has shown great potential for learning meaningful sparse representations of observed data. However, it suffers from a slow linear convergence rate, which discourages its application to large-scale data representation. In this paper, a fast NsNMF (FNsNMF) algorithm is proposed to speed up NsNMF. In the proposed method, we first show that the cost function of the derived subproblem is convex and that its gradient is Lipschitz continuous. The optimization of this function is then replaced by solving a proximal function, which is designed using the Lipschitz constant and can be minimized via a constructed fast convergent sequence. Owing to the proximal function and its efficient optimization, our method achieves a nonlinear convergence rate, much faster than that of NsNMF. Simulations on both computer-generated and real-world data show the advantages of our algorithm over the compared methods.
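The abstract describes an accelerated proximal-gradient scheme: each NMF subproblem has a convex cost with a Lipschitz-continuous gradient, so a projected gradient step with step size 1/L, combined with a fast momentum sequence, attains a nonlinear convergence rate. The sketch below illustrates this idea on the standard nonnegative least-squares subproblem min_{H>=0} ||X - WH||_F^2; the function name, iteration count, and all details are illustrative assumptions, not the authors' exact FNsNMF algorithm.

```python
import numpy as np

def accel_prox_nls(X, W, n_iter=1000):
    """Illustrative accelerated projected-gradient solver (assumed sketch,
    not the paper's FNsNMF) for the convex NMF subproblem
        min_{H >= 0} ||X - W H||_F^2."""
    WtW = W.T @ W
    WtX = W.T @ X
    # Lipschitz constant of the gradient H -> WtW @ H - WtX
    # is the largest eigenvalue (spectral norm) of WtW.
    L = np.linalg.norm(WtW, 2)
    H = np.maximum(WtX, 0.0) / max(L, 1e-12)   # simple nonnegative start
    Y, t = H.copy(), 1.0                        # extrapolation point, momentum
    for _ in range(n_iter):
        grad = WtW @ Y - WtX                    # gradient of the smooth cost at Y
        H_new = np.maximum(Y - grad / L, 0.0)   # proximal step = projection onto H >= 0
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = H_new + ((t - 1.0) / t_new) * (H_new - H)  # fast convergent sequence
        H, t = H_new, t_new
    return H
```

The momentum update of `t` and the extrapolated point `Y` are what lift the plain projected-gradient method's linear rate to the faster rate the abstract refers to; alternating such solves over the two factors yields the full factorization.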
Year
2016
DOI
10.1109/ACCESS.2016.2605704
Venue
IEEE ACCESS
Keywords
Nonnegative matrix factorization, sparse representation, nonlinear convergence rate
Field
Mathematical optimization, External Data Representation, Computer science, Sparse approximation, Matrix decomposition, Algorithm, Non-negative matrix factorization, Rate of convergence, Lipschitz continuity, Cuthill–McKee algorithm, Sparse matrix, Distributed computing
DocType
Journal
Volume
4
ISSN
2169-3536
Citations
4
PageRank
0.42
References
20
Authors
5
Name	Order	Citations	PageRank
Zu-yuan Yang	1	312	24.12
Yu Zhang	2	40	10.46
Wei Yan	3	38	23.56
Yong Xiang	4	1137	93.92
Shengli Xie	5	2530	161.51