Title
New Generalization Bounds for Learning Kernels
Abstract
This paper presents several novel generalization bounds for the problem of learning kernels, based on an analysis of the Rademacher complexity of the corresponding hypothesis sets. Our bound for learning kernels with a convex combination of p base kernels has only a log(p) dependency on the number of kernels p, which is considerably more favorable than the previous best bound for the same problem. We also give a novel bound for learning with a linear combination of p base kernels with L_2 regularization, whose dependency on p is only in p^{1/4}.
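The hypothesis sets studied in the abstract are built from a kernel of the form K = sum_k mu_k K_k, with the weights mu_k constrained (for the convex-combination case) to the simplex. The sketch below is not the paper's algorithm; it is a minimal illustration, assuming two hypothetical base kernels (linear and Gaussian), kernel ridge regression as the base learner, and a simple grid search over the two-kernel simplex using held-out error to pick the weights.

```python
# Illustrative sketch only: learning with a convex combination of base kernels.
# The combined kernel is K = sum_k mu_k * K_k with mu on the simplex
# (mu_k >= 0, sum_k mu_k = 1). Weights are chosen here by grid search on
# held-out error, with kernel ridge regression as the base learner.
import numpy as np

def kernel_ridge_fit(K, y, lam=1e-2):
    """Return dual coefficients alpha for kernel ridge regression."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def combined_kernel(kernels, mu):
    """Convex combination K = sum_k mu_k K_k of base kernel matrices."""
    return sum(m * K for m, K in zip(mu, kernels))

def linear_kernel(A, B):
    return A @ B.T

def gaussian_kernel(A, B, gamma=0.5):
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

# Toy data and p = 2 hypothetical base kernels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=60)
train, test = np.arange(40), np.arange(40, 60)
base = [linear_kernel, gaussian_kernel]

best = None
for w in np.linspace(0.0, 1.0, 11):        # grid over the 2-kernel simplex
    mu = np.array([w, 1.0 - w])
    K_train = combined_kernel([k(X[train], X[train]) for k in base], mu)
    K_test = combined_kernel([k(X[test], X[train]) for k in base], mu)
    alpha = kernel_ridge_fit(K_train, y[train])
    err = np.mean((K_test @ alpha - y[test]) ** 2)
    if best is None or err < best[0]:
        best = (err, mu)

print("best weights:", best[1], "held-out MSE:", round(best[0], 4))
```

With only two base kernels the simplex is one-dimensional, so an exhaustive grid suffices; with larger p one would instead optimize the weights jointly with the learner, which is the setting the paper's bounds address.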
Year
2009
Venue
International Conference on Machine Learning
Keywords
convex combination, artificial intelligence
Field
Discrete mathematics, Convex combination, Rademacher complexity, Regularization (mathematics), Artificial intelligence, Generalization error, Combinatorial analysis, Linear function, Machine learning, Mathematics
DocType
Journal
Volume
abs/0912.3
Citations
51
PageRank
1.94
References
21
Authors
3
Name                 Order  Citations  PageRank
Corinna Cortes       1      65741      120.50
Mehryar Mohri        2      45024      48.21
Afshin Rostamizadeh  3      9114       4.15