Title
Fast Hybrid Algorithm for Big Matrix Recovery.
Abstract
The large-scale nuclear-norm-penalized least-squares problem (NNLS) frequently arises when estimating low-rank structures. In this paper we accelerate its solution by combining non-smooth convex optimization with smooth Riemannian optimization. Our method comprises two phases. In the first phase, we use the Alternating Direction Method of Multipliers (ADMM) both to identify the fixed-rank manifold on which an optimum resides and to provide an initializer for the subsequent refinement. In the second phase, two super-linearly convergent Riemannian methods, Riemannian Newton (NT) and Riemannian Conjugate Gradient descent (CG), are used to refine the approximation on the fixed-rank manifold. We prove that our hybrid method of ADMM and NT (HADMNT) converges to an optimum of NNLS at least quadratically. Experiments on large-scale collaborative filtering datasets demonstrate very competitive performance of these fast hybrid methods compared to state-of-the-art solvers.
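The two-phase idea in the abstract can be illustrated with a simplified NumPy sketch for matrix completion. This is not the authors' HADMNT algorithm: phase 1 below uses plain proximal iterations with singular value thresholding as a stand-in for ADMM (both identify a rank and produce a warm start), and phase 2 uses gradient descent on a rank-r factorization X = UVᵀ as a stand-in for Riemannian Newton/CG. All function names, step sizes, and iteration counts are illustrative assumptions.

```python
import numpy as np

def svt(X, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_*
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return U @ np.diag(s) @ Vt, int(np.sum(s > 0))

def phase1_svt(M, mask, lam=0.5, iters=100):
    # Phase 1 (stand-in for ADMM): proximal iterations on the
    # nuclear-norm-penalized least squares objective. The thresholding
    # reveals a candidate rank and yields an initializer.
    X = np.zeros_like(M)
    rank = 0
    for _ in range(iters):
        G = mask * (X - M)           # gradient of the data-fit term
        X, rank = svt(X - G, lam)
    return X, rank

def phase2_fixed_rank(M, mask, X0, rank, iters=200, step=0.01):
    # Phase 2 (stand-in for Riemannian NT/CG): unbiased refinement on
    # the fixed-rank factorization X = U V^T, started from phase 1.
    U, s, Vt = np.linalg.svd(X0, full_matrices=False)
    U = U[:, :rank] * np.sqrt(s[:rank])
    V = Vt[:rank].T * np.sqrt(s[:rank])
    for _ in range(iters):
        R = mask * (U @ V.T - M)     # residual on observed entries
        U, V = U - step * R @ V, V - step * R.T @ U
    return U @ V.T
```

In the actual paper, phase 2 runs on the fixed-rank manifold with true Riemannian metrics and retractions, which is what delivers the super-linear (for NT, quadratic) local convergence; the plain factored gradient descent above only mimics the "refine at fixed rank" structure.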
Year: 2016
Venue: AAAI
Field: Conjugate gradient method, Least squares, Mathematical optimization, Hybrid algorithm, Computer science, Matrix (mathematics), Matrix norm, Artificial intelligence, Initialization, Convex optimization, Manifold, Machine learning
DocType: Conference
Citations: 0
PageRank: 0.34
References: 20
Authors: 4
Name          Order  Citations  PageRank
Tengfei Zhou  1      22         5.08
Hui Qian      2      59         13.26
Zebang Shen   3      17         9.36
Congfu Xu     4      132        14.31