Title
Robust Differentiable SVD
Abstract
Eigendecomposition of symmetric matrices is at the heart of many computer vision algorithms. However, the derivatives of the eigenvectors tend to be numerically unstable, whether using the SVD to compute them analytically or using the Power Iteration (PI) method to approximate them. This instability arises in the presence of eigenvalues that are close to each other. This makes integrating eigendecomposition into deep networks difficult and often results in poor convergence, particularly when dealing with large matrices. While this can be mitigated by partitioning the data into small arbitrary groups, doing so has no theoretical basis and makes it impossible to exploit the full power of eigendecomposition. In previous work, we mitigated this by using SVD during the forward pass and PI to compute the gradients during the backward pass. However, the iterative deflation procedure required to compute multiple eigenvectors using PI tends to accumulate errors and yield inaccurate gradients. Here, we show that the Taylor expansion of the SVD gradient is theoretically equivalent to the gradient obtained using PI, but does not rely on an iterative process in practice and thus yields more accurate gradients. We demonstrate the benefits of this increased accuracy for image classification and style transfer.
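To make the idea in the abstract concrete, below is a minimal PyTorch sketch of the general approach it describes: compute an exact eigendecomposition in the forward pass, but in the backward pass replace the unstable 1/(w_i - w_j) coefficient of the standard eigendecomposition gradient with a truncated Taylor (geometric-series) expansion, which stays bounded when eigenvalues are nearly equal. This is an illustration under stated assumptions, not the authors' reference implementation: the function names, the default number of Taylor terms, and the restriction to positive eigenvalues (e.g., covariance matrices, as in style transfer or decorrelated batch normalization) are assumptions made here for clarity.

```python
import torch

def stable_K(w, num_terms=9):
    # w: (d,) eigenvalues in descending order, assumed strictly positive
    # (e.g., eigenvalues of a covariance matrix).
    # Exact coefficient: K[i, j] = 1 / (w[j] - w[i]) for i != j, 0 on the diagonal.
    # For i < j (so w[i] > w[j] > 0), expand the geometric series
    #   1 / (w[i] - w[j]) = (1 / w[i]) * sum_{n >= 0} (w[j] / w[i]) ** n
    # and truncate it at `num_terms` terms (an assumed default), which keeps the
    # coefficient bounded even when w[i] is close to w[j].
    ratio = w.unsqueeze(0) / w.unsqueeze(1)          # ratio[i, j] = w[j] / w[i]
    series = sum(ratio ** n for n in range(num_terms)) / w.unsqueeze(1)
    upper = series.triu(1)                           # valid half of the expansion: i < j
    return upper.t() - upper                         # K is antisymmetric off the diagonal

class TaylorEigh(torch.autograd.Function):
    """Exact eigendecomposition forward; Taylor-expanded gradient backward."""

    @staticmethod
    def forward(ctx, A):
        w, V = torch.linalg.eigh(A)                  # ascending eigenvalues
        w, V = w.flip(0), V.flip(1)                  # reorder to descending
        ctx.save_for_backward(w, V)
        return w, V

    @staticmethod
    def backward(ctx, grad_w, grad_V):
        w, V = ctx.saved_tensors
        K = stable_K(w)                              # bounded surrogate for 1/(w_j - w_i)
        # Standard symmetric-eigendecomposition gradient, with K substituted
        # for the exact reciprocal eigenvalue gaps:
        inner = K * (V.t() @ grad_V) + torch.diag(grad_w)
        grad_A = V @ inner @ V.t()
        return 0.5 * (grad_A + grad_A.t())           # symmetrize: the input is symmetric
```

A short usage sketch, with a random symmetric positive definite input:

```python
torch.manual_seed(0)
A = torch.randn(8, 8, dtype=torch.double)
A = A @ A.t() + 1e-3 * torch.eye(8, dtype=torch.double)
A.requires_grad_(True)
w, V = TaylorEigh.apply(A)
(w.sum() + V[:, 0].pow(2).sum()).backward()          # gradients stay finite even
print(A.grad)                                        # with clustered eigenvalues
```

The number of Taylor terms trades accuracy for robustness: more terms tighten the approximation where eigenvalues are well separated, while truncation keeps the gradient bounded where they are not, avoiding the error accumulation of iterative deflation mentioned in the abstract.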
Year
2022
DOI
10.1109/TPAMI.2021.3072422
Venue
IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords
Eigendecomposition, differentiable SVD, power iteration, Taylor expansion
DocType
Journal
Volume
44
Issue
9
ISSN
0162-8828
Citations
0
PageRank
0.34
References
25
Authors
5
Name              Order  Citations  PageRank
Wei Wang          1      131        14.16
Zheng Dang        2      0          0.34
Yinlin Hu         3      27         5.43
Pascal Fua        4      12768      731.45
Mathieu Salzmann  5      1578       88.48