Title
Dual Principal Component Pursuit: Improved Analysis and Efficient Algorithms
Abstract
Recent methods for learning a linear subspace from data corrupted by outliers are based on convex ℓ1 and nuclear norm optimization and require the dimension of the subspace and the number of outliers to be sufficiently small [27]. In sharp contrast, the recently proposed Dual Principal Component Pursuit (DPCP) method [22] can provably handle subspaces of high dimension by solving a non-convex ℓ1 optimization problem on the sphere. However, its geometric analysis is based on quantities that are difficult to interpret and are not amenable to statistical analysis. In this paper we provide a refined geometric analysis and a new statistical analysis that show that DPCP can tolerate as many outliers as the square of the number of inliers, thus improving upon other provably correct robust PCA methods. We also propose a scalable Projected Sub-Gradient Method (DPCP-PSGM) for solving the DPCP problem and show that it achieves linear convergence even though the underlying optimization problem is non-convex and non-smooth. Experiments on road plane detection from 3D point cloud data demonstrate that DPCP-PSGM can be more efficient than the traditional RANSAC algorithm, which is one of the most popular methods for such computer vision applications.
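The abstract describes DPCP as minimizing an ℓ1 objective over the unit sphere via a projected subgradient method. As a hedged illustration only (the function name, step-size schedule, and parameter values below are assumptions, not the authors' exact algorithm), a minimal sketch of such an iteration for min_{||b||=1} ||X^T b||_1 might look like:

```python
import numpy as np

def dpcp_psgm(X, mu0=0.1, beta=0.97, iters=300, seed=0):
    """Illustrative projected subgradient sketch for min_{||b||=1} ||X^T b||_1.
    The minimizer b is (ideally) a normal vector to the inlier subspace.
    Step-size schedule and defaults here are assumptions for illustration."""
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(X.shape[0])
    b /= np.linalg.norm(b)            # start on the unit sphere
    mu = mu0
    for _ in range(iters):
        g = X @ np.sign(X.T @ b)      # a subgradient of ||X^T b||_1
        b = b - mu * g                # subgradient step
        b /= np.linalg.norm(b)        # project back onto the sphere
        mu *= beta                    # geometrically diminishing step size
    return b
```

For a plane in 3D (as in the road-plane detection experiment), the recovered `b` should align with the plane's normal vector when inliers sufficiently outnumber outliers.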
Year
2018
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Keywords
efficient algorithms, optimization problem, linear convergence, statistical analysis, linear subspace, geometric analysis
Field
Subgradient method, Subspace topology, RANSAC, Computer science, Geometric analysis, Algorithm, Matrix norm, Robust principal component analysis, Linear subspace, Optimization problem
DocType
Conference
Volume
31
ISSN
1049-5258
Citations
3
PageRank
0.38
References
0
Authors
6
Name | Order | Citations | PageRank
Zhihui Zhu | 1 | 121 | 25.37
Yifan Wang | 2 | 32 | 1.52
Daniel P. Robinson | 3 | 261 | 21.51
D. Q. Naiman | 4 | 3 | 0.72
Rene Victor Valqui Vidal | 5 | 5331 | 260.14
Manolis C. Tsakiris | 6 | 50 | 9.79