Title
Globally Convergent Stochastic Optimization for Canonical Correlation Analysis
Abstract
We study the stochastic optimization of canonical correlation analysis (CCA), whose objective is nonconvex and does not decouple over training samples. Although several stochastic optimization algorithms have recently been proposed for this problem, none of them comes with a global convergence guarantee. Based on the alternating least squares formulation of CCA, we propose a globally convergent stochastic algorithm, which solves the resulting least squares problems approximately, to sufficient accuracy, with state-of-the-art stochastic gradient methods for convex optimization. We bound the overall time complexity of our algorithm, which significantly improves upon that of previous work. Experimental results demonstrate the superior performance of our algorithm.
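The alternating least squares (ALS) formulation the abstract refers to repeatedly solves a least squares problem in one view's projection while holding the other view's fixed, normalizing after each step. The paper's contribution is to solve these subproblems *approximately* with stochastic gradient methods; the sketch below is only a minimal batch illustration of the ALS idea for the top canonical pair, with a hypothetical function name and a small ridge term added for numerical stability (both are assumptions, not the authors' algorithm).

```python
import numpy as np

def cca_als(X, Y, iters=100, reg=1e-6, seed=0):
    """Illustrative batch ALS for the top canonical pair of (X, Y).

    Hypothetical sketch: the paper instead solves each least squares
    subproblem inexactly with stochastic gradient methods.
    """
    X = X - X.mean(axis=0)                 # center both views
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])   # regularized covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(Y.shape[1])
    v /= np.sqrt(v @ Cyy @ v)              # normalize so Var(Yv) = 1
    for _ in range(iters):
        # u-step: least squares  min_u (1/n)||X u - Y v||^2 (+ ridge)
        u = np.linalg.solve(Cxx, X.T @ (Y @ v) / n)
        u /= np.sqrt(u @ Cxx @ u)          # normalize so Var(Xu) = 1
        # v-step: the symmetric least squares problem in v
        v = np.linalg.solve(Cyy, Y.T @ (X @ u) / n)
        v /= np.sqrt(v @ Cyy @ v)
    corr = (X @ u) @ (Y @ v) / n           # empirical canonical correlation
    return u, v, corr
```

Each exact `solve` above is what a stochastic variant would replace with a few passes of a convex stochastic gradient solver over the samples.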
Year
2016
Venue
arXiv: Learning
Field
Least squares, Convergence (mathematics), Mathematical optimization, Stochastic optimization, Canonical correlation, Artificial intelligence, Time complexity, Alternating least squares, Convex optimization, Machine learning, Mathematics
DocType
Journal
Volume
abs/1604.01870
Citations
2
PageRank
0.38
References
12
Authors
3
Name / Order / Citations+PageRank
Weiran Wang / 1 / 1149.99
Jialei Wang / 2 / 7710.29
Nathan Srebro / 3 / 3892349.42