Title
Generalization Properties of Hyper-RKHS and Its Applications
Abstract
This paper generalizes regularized regression problems to a hyper-reproducing kernel Hilbert space (hyper-RKHS), illustrates its utility for kernel learning and out-of-sample extensions, and proves asymptotic convergence results for the introduced regression models from the viewpoint of approximation theory. Algorithmically, we consider two regularized regression models with bivariate forms in this space, namely kernel ridge regression (KRR) and support vector regression (SVR) endowed with hyper-RKHS, and further combine divide-and-conquer with Nyström approximation for scalability on large samples. This framework is general: the underlying kernel is learned from a broad class and can be positive definite or not, which adapts to various requirements in kernel learning. Theoretically, we study the convergence behavior of regularized regression algorithms in hyper-RKHS and derive learning rates, which goes beyond the classical analysis in RKHS due to the non-trivial dependence among pairwise samples and the characterization of hyper-RKHS. Experimentally, results on several benchmarks suggest that the employed framework is able to learn a general kernel function from an arbitrary similarity matrix, and thus achieves satisfactory performance on classification tasks.
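To make the abstract's setting concrete, below is a minimal, self-contained sketch (not the authors' code) of regularized regression in a hyper-RKHS: the entries of a given, possibly indefinite similarity matrix are regressed on pairs of inputs via kernel ridge regression with an assumed product-Gaussian hyperkernel, and the fitted expansion gives a learned kernel that can be evaluated out of sample. The function names, the hyperkernel choice, and the parameters sigma and lam are illustrative assumptions; the scalability devices mentioned in the abstract (Nyström approximation, divide-and-conquer) are only indicated in comments rather than implemented.

```python
# A minimal, illustrative sketch of KRR in a hyper-RKHS (assumptions noted above).
import numpy as np

def gaussian_hyperkernel(P, Q, sigma=1.0):
    """Hyperkernel between pairs P[i] = (x, x') and Q[j] = (t, t').
    Assumed form: exp(-(||x - t||^2 + ||x' - t'||^2) / (2 * sigma^2))."""
    d = P.shape[1] // 2
    sq = lambda A, B: ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-(sq(P[:, :d], Q[:, :d]) + sq(P[:, d:], Q[:, d:])) / (2 * sigma ** 2))

def fit_hyper_krr(X, S, lam=1e-2, sigma=1.0):
    """Learn a kernel function from a (possibly indefinite) similarity matrix S
    by ridge regression over all pairs (x_i, x_j) with targets S[i, j]."""
    n = X.shape[0]
    # Build the n^2 pairwise inputs; for large n, Nystrom approximation and
    # divide-and-conquer (as described in the abstract) would replace this exact solve.
    pairs = np.hstack([np.repeat(X, n, axis=0), np.tile(X, (n, 1))])
    y = S.reshape(-1)
    H = gaussian_hyperkernel(pairs, pairs, sigma)
    alpha = np.linalg.solve(H + lam * n ** 2 * np.eye(n ** 2), y)

    def learned_kernel(A, B):
        """Out-of-sample evaluation of the learned kernel k(a, b)."""
        new_pairs = np.hstack([np.repeat(A, B.shape[0], axis=0),
                               np.tile(B, (A.shape[0], 1))])
        return (gaussian_hyperkernel(new_pairs, pairs, sigma) @ alpha).reshape(
            A.shape[0], B.shape[0])

    return learned_kernel

# Toy usage: learn a kernel from a noisy, not-necessarily-PSD similarity matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(15, 3))
S = np.tanh(X @ X.T) + 0.05 * rng.normal(size=(15, 15))  # indefinite in general
S = (S + S.T) / 2
k = fit_hyper_krr(X, S, lam=1e-3)
X_new = rng.normal(size=(4, 3))
print(k(X_new, X).shape)  # (4, 15): learned kernel values for unseen points
```

The toy example only illustrates the bivariate-regression formulation; in practice the n^2-by-n^2 hyperkernel Gram matrix is the bottleneck that motivates the Nyström and divide-and-conquer strategies referenced in the abstract.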
Year
2021
DOI
v22/19-482.html
Venue
JOURNAL OF MACHINE LEARNING RESEARCH
Keywords
hyper-RKHS, approximation theory, kernel learning, out-of-sample extensions
DocType
Journal
Volume
22
Issue
1
ISSN
1532-4435
Citations
0
PageRank
0.34
References
0
Authors
5
Name            Order   Citations   PageRank
Fanghui Liu     1       59          10.60
Lei Shi         2       11          3.64
Xiaolin Huang   3       242         37.33
Jie Yang        4       868         87.15
J. A. Suykens   5       30          5.97