Title
Distributed Learning via Filtered Hyperinterpolation on Manifolds
Abstract
Learning mappings of data on manifolds is an important topic in contemporary machine learning, with applications in astrophysics, geophysics, statistical physics, medical diagnosis, biochemistry, and 3D object analysis. This paper studies the problem of learning real-valued functions on manifolds through filtered hyperinterpolation of input–output data pairs, where the inputs may be sampled deterministically or at random and the outputs may be clean or noisy. Motivated by the problem of handling large data sets, we present a parallel data processing approach that distributes the data-fitting task among multiple servers and synthesizes the fitted sub-models into a global estimator. We prove quantitative relations between the approximation quality of the learned function over the entire manifold, the type of target function, the number of servers, and the number and type of available samples. We obtain approximation rates of convergence for both the distributed and non-distributed approaches; in the non-distributed case, the approximation order is optimal.
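The distributed scheme described in the abstract can be made concrete with a small sketch. Below is a minimal NumPy illustration (not the authors' implementation) of distributed filtered hyperinterpolation on the simplest compact manifold, the circle S^1, where the Laplacian eigenfunctions are the Fourier modes: each of m hypothetical servers builds a local filtered hyperinterpolation from its share of noisy samples using an exact equal-weight quadrature, and the global estimator is the average of the local fits. The filter h, the degree n, the number of servers m, the target function, and the interleaved data split are all illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch (not the authors' code) of distributed filtered
# hyperinterpolation on the circle S^1.
import numpy as np

def filter_h(t):
    """Low-pass filter: 1 on [0,1], smooth cosine-squared decay on [1,2], 0 beyond."""
    t = np.asarray(t, dtype=float)
    out = np.where(t <= 1.0, 1.0, 0.0)
    mid = (t > 1.0) & (t < 2.0)
    return np.where(mid, np.cos(0.5 * np.pi * (t - 1.0)) ** 2, out)

def filtered_kernel(x, z, n):
    """K_n(x,z) = (1/2pi) [1 + 2 sum_{l>=1} h(l/n) cos(l(x-z))]; h removes all l >= 2n."""
    ells = np.arange(1, 2 * n)
    diff = np.subtract.outer(x, z)                   # shape (len(x), len(z))
    cosines = np.cos(np.multiply.outer(diff, ells))  # shape (len(x), len(z), len(ells))
    return (1.0 + 2.0 * cosines @ filter_h(ells / n)) / (2.0 * np.pi)

def local_estimator(x_eval, xj, yj, n):
    """One server's filtered hyperinterpolation; equispaced xj give weights 2pi/N_j."""
    return filtered_kernel(x_eval, xj, n) @ (2.0 * np.pi / len(xj) * yj)

rng = np.random.default_rng(0)
n, m = 8, 4                               # filter degree and number of servers (illustrative)
N = m * 4 * n                             # 4n equispaced points per server (exact quadrature)
x_all = 2.0 * np.pi * np.arange(N) / N
y_all = np.exp(np.sin(x_all)) + 0.1 * rng.standard_normal(N)  # noisy samples of a smooth target

x_eval = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
# Interleaved split: server k receives indices k, k+m, k+2m, ...,
# which remain an equispaced (hence exact) quadrature rule on S^1.
local_fits = [local_estimator(x_eval, x_all[k::m], y_all[k::m], n) for k in range(m)]
global_fit = np.mean(local_fits, axis=0)  # equal-size servers => simple average

err = np.max(np.abs(global_fit - np.exp(np.sin(x_eval))))
print(f"uniform error of distributed estimator: {err:.4f}")
```

Averaging the m local estimators reduces the variance contributed by the output noise while each server only ever touches a fraction of the data, which is the mechanism behind the distributed error bounds the paper proves.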
Year: 2022
DOI: 10.1007/s10208-021-09529-5
Venue: Foundations of Computational Mathematics
Keywords: Distributed learning, Filtered hyperinterpolation, Approximation on manifolds, Kernel methods, Numerical integration on manifolds, Quadrature rule, Random sampling, Gaussian white noise, 68W15, 58C05, 65D05, 68Q32, 42C15, 65T60, 41A50
DocType: Journal
Volume: 22
Issue: 4
ISSN: 1615-3375
Citations: 0
PageRank: 0.34
References: 13
Authors: 2
Name            Order  Citations  PageRank
Guido Montufar  1      7          5.63
Yuguang Wang    2      119        8.13