Title
Stop Wasting My Gradients: Practical SVRG
Abstract
We present and analyze several strategies for improving the performance of stochastic variance-reduced gradient (SVRG) methods. We first show that the convergence rate of these methods can be preserved under a decreasing sequence of errors in the control variate, and use this to derive variants of SVRG that use growing-batch strategies to reduce the number of gradient calculations required in the early iterations. We further (i) show how to exploit support vectors to reduce the number of gradient computations in the later iterations, (ii) prove that the commonly-used regularized SVRG iteration is justified and improves the convergence rate, (iii) consider alternate mini-batch selection strategies, and (iv) consider the generalization error of the method.
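To make the growing-batch idea from the abstract concrete, here is a minimal sketch of an SVRG loop where the snapshot ("full") gradient is estimated on a batch that grows each epoch, so the control-variate error decreases over time. This is an illustration, not the paper's implementation; the per-example gradient oracle grad_i(x, i) and all names and defaults (svrg_growing_batch, batch0, growth, eta, m) are assumptions for the sketch.

```python
import numpy as np

def svrg_growing_batch(grad_i, x0, n, eta=0.1, epochs=10, m=None,
                       batch0=64, growth=2.0, rng=None):
    """Sketch: SVRG whose snapshot gradient is estimated on a growing batch.

    grad_i(x, i) should return the gradient of the i-th loss term at x
    (hypothetical oracle, not part of the paper's code).
    """
    rng = np.random.default_rng() if rng is None else rng
    m = n if m is None else m            # inner-loop length per epoch
    x_snap = np.asarray(x0, dtype=float).copy()
    batch = batch0
    for _ in range(epochs):
        # Estimate the snapshot gradient mu on a subsample that grows each
        # epoch, so early epochs need far fewer gradient evaluations than
        # the full pass used by standard SVRG.
        idx = rng.choice(n, size=min(int(batch), n), replace=False)
        mu = np.mean([grad_i(x_snap, i) for i in idx], axis=0)
        x = x_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced gradient in control-variate form.
            g = grad_i(x, i) - grad_i(x_snap, i) + mu
            x = x - eta * g
        x_snap = x
        batch *= growth                  # shrink the control-variate error
    return x_snap
```

Once batch reaches n, the snapshot gradient is exact and the loop reduces to standard SVRG, which is why a decreasing error sequence can preserve the convergence rate.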
Year
2015
Venue
NIPS
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
6
Name                  Order  Citations  PageRank
Reza Harikandeh       1      0          0.34
Mohamed Osama Ahmed   2      7          2.01
Alim Virani           3      21         1.59
Mark W. Schmidt       4      1295       84.47
Jakub Konecný         5      363        19.21
Scott Sallinen        6      41         3.01