Title
A Learning-Rate Schedule for Stochastic Gradient Methods to Matrix Factorization.
Abstract
Stochastic gradient methods are effective for solving matrix factorization problems. However, it is well known that the performance of a stochastic gradient method depends heavily on the learning-rate schedule used; a good schedule can significantly boost the training process. In this paper, motivated by past work on convex optimization that assigns a learning rate to each variable, we propose a new schedule for matrix factorization. Experiments demonstrate that the proposed schedule leads to faster convergence than existing ones. Our schedule uses the same parameter on all data sets included in our experiments, so the time spent on learning-rate selection can be significantly reduced. By applying this schedule to a state-of-the-art matrix factorization package, the resulting implementation outperforms available parallel matrix factorization packages.
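Note: the following is a minimal sketch of the general idea the abstract refers to, namely SGD matrix factorization with a per-variable (AdaGrad-style) learning rate; it is illustrative only and is not the exact schedule proposed in the paper or the implementation in the accompanying package. The function and parameter names (sgd_mf_adaptive, eta0, lam, epochs) are hypothetical.

import numpy as np

def sgd_mf_adaptive(R, k=8, eta0=0.1, lam=0.05, epochs=20, seed=0):
    # Sketch of per-variable learning rates for SGD matrix factorization.
    # Each entry of the factor matrices keeps its own accumulated squared
    # gradient, so each variable receives its own step size.
    rng = np.random.default_rng(seed)
    rows, cols = R.nonzero()                 # indices of observed entries
    m, n = R.shape
    P = 0.1 * rng.standard_normal((m, k))    # row (user) factors
    Q = 0.1 * rng.standard_normal((n, k))    # column (item) factors
    GP = np.full_like(P, 1e-8)               # accumulated squared gradients for P
    GQ = np.full_like(Q, 1e-8)               # accumulated squared gradients for Q
    for _ in range(epochs):
        for u, v in zip(rows, cols):
            err = R[u, v] - P[u] @ Q[v]
            gp = -err * Q[v] + lam * P[u]    # gradient of the regularized loss w.r.t. P[u]
            gq = -err * P[u] + lam * Q[v]    # gradient w.r.t. Q[v]
            GP[u] += gp ** 2                 # update per-variable accumulators
            GQ[v] += gq ** 2
            P[u] -= eta0 / np.sqrt(GP[u]) * gp   # per-variable step sizes
            Q[v] -= eta0 / np.sqrt(GQ[v]) * gq
    return P, Q

Usage on a small toy matrix: build a dense array R whose nonzero entries are the observed ratings and call P, Q = sgd_mf_adaptive(R); the product P @ Q.T then approximates the observed entries.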
Year
2015
DOI
10.1007/978-3-319-18038-0_35
Venue
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PART I
Keywords
Matrix factorization, Stochastic gradient method, Learning rate schedule
Field
Convergence (routing), Mathematical optimization, Data set, Incomplete Cholesky factorization, Computer science, Matrix decomposition, Stochastic gradient method, Incomplete LU factorization, Convex optimization
DocType
Conference
Volume
9077
ISSN
0302-9743
Citations
20
PageRank
1.02
References
10
Authors
4
Name            Order  Citations  PageRank
Wei-Sheng Chin  1      236        8.76
Yong Zhuang     2      254        13.88
Yu-Chin Juan    3      252        9.54
Chih-Jen Lin    4      20286      1475.84