Title
AUTOMATIC DIFFERENTIATION FOR RIEMANNIAN OPTIMIZATION ON LOW-RANK MATRIX AND TENSOR-TRAIN MANIFOLDS
Abstract
In scientific computing and machine learning applications, matrices and more general multidimensional arrays (tensors) can often be approximated with the help of low-rank decompositions. Since matrices and tensors of fixed rank form smooth Riemannian manifolds, one of the popular tools for finding low-rank approximations is to use Riemannian optimization. Nevertheless, efficient implementation of Riemannian gradients and Hessians, required in Riemannian optimization algorithms, can be a nontrivial task in practice. Moreover, in some cases, analytic formulas are not even available. In this paper, we build upon automatic differentiation and propose a method that, given an implementation of the function to be minimized, efficiently computes Riemannian gradients and matrix-by-vector products between an approximate Riemannian Hessian and a given vector.
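As an illustration of the idea described in the abstract (not the paper's implementation), a Riemannian gradient on the manifold of fixed-rank matrices can be obtained by projecting a Euclidean gradient onto the tangent space at the current point. The sketch below uses NumPy and a hand-written gradient of a simple quadratic objective in place of automatic differentiation; the names `project_tangent` and the objective `f(X) = ||X - A||_F^2` are illustrative assumptions.

```python
import numpy as np

def project_tangent(U, V, Z):
    """Project Z onto the tangent space of rank-r matrices at X = U S V^T,
    where U and V have orthonormal columns:
    P(Z) = U U^T Z + Z V V^T - U U^T Z V V^T."""
    return U @ (U.T @ Z) + (Z @ V) @ V.T - U @ (U.T @ Z @ V) @ V.T

rng = np.random.default_rng(0)
m, n, r = 8, 6, 2

# Random rank-r point X = U S V^T with orthonormal factors.
U, _ = np.linalg.qr(rng.standard_normal((m, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
S = np.diag(rng.standard_normal(r))
X = U @ S @ V.T

# Euclidean gradient of the illustrative objective f(X) = ||X - A||_F^2
# (in the paper's setting this would come from automatic differentiation).
A = rng.standard_normal((m, n))
euclid_grad = 2.0 * (X - A)

# Riemannian gradient: project the Euclidean gradient onto the tangent space.
riem_grad = project_tangent(U, V, euclid_grad)

# Sanity check: tangent-space projection is idempotent.
assert np.allclose(project_tangent(U, V, riem_grad), riem_grad)
```

The same projection pattern generalizes to tensor-train manifolds, where the paper's contribution is computing such gradients and approximate Hessian-by-vector products efficiently from an implementation of the objective alone.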
Year
2022
DOI
10.1137/20M1356774
Venue
SIAM JOURNAL ON SCIENTIFIC COMPUTING
Keywords
automatic differentiation, Riemannian optimization, low-rank approximation, tensor-train decomposition
DocType
Journal
Volume
44
Issue
2
ISSN
1064-8275
Citations
0
PageRank
0.34
References
0
Authors
3
Name                 Order  Citations  PageRank
Alexander Novikov    1      0          0.34
Maxim Rakhuba        2      0          1.69
Ivan V. Oseledets    3      306        41.96