Title
Global Rademacher Complexity Bounds: From Slow to Fast Convergence Rates
Abstract
Previous works in the literature have shown that performance estimates of learning procedures can be characterized by a convergence rate ranging between $O(n^{-1})$ (fast rate) and $O(n^{-1/2})$ (slow rate). In order to derive such results, some assumptions on the problem are required; moreover, even when convergence is fast, the constants characterizing the bounds are often quite loose. In this work, we prove new Rademacher complexity (RC) based bounds which do not require any additional assumptions to achieve a fast convergence rate $O(n^{-1})$ in the optimistic case and a slow rate $O(n^{-1/2})$ in the general case. At the same time, they are characterized by smaller constants than other state-of-the-art fast-converging RC alternatives in the literature. The results of this work are obtained by exploiting the fundamental work of Talagrand on concentration inequalities for product measures and empirical processes. As a further contribution, we also extend the results to the semi-supervised learning framework, showing how additional unlabeled samples improve the tightness of the derived bound.
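For background (not taken from the paper itself), the abstract can be read against the standard slow-rate Rademacher complexity bound; the notation below ($\mathcal{F}$, $z_i$, $\delta$) is introduced here only for illustration. For a class $\mathcal{F}$ of $[0,1]$-valued functions and an i.i.d. sample $z_1,\ldots,z_n$, the empirical Rademacher complexity and the classical bound read:

$$\hat{R}_n(\mathcal{F}) \;=\; \mathbb{E}_{\sigma}\!\left[\,\sup_{f\in\mathcal{F}} \frac{1}{n}\sum_{i=1}^{n}\sigma_i f(z_i)\right], \qquad \sigma_1,\ldots,\sigma_n \ \text{i.i.d. uniform on } \{-1,+1\},$$

$$\mathbb{E}[f(z)] \;\le\; \frac{1}{n}\sum_{i=1}^{n} f(z_i) \;+\; 2\hat{R}_n(\mathcal{F}) \;+\; 3\sqrt{\frac{\ln(2/\delta)}{2n}} \qquad \text{with probability at least } 1-\delta, \ \forall f\in\mathcal{F}.$$

In this classical form the deviation term decays as $O(n^{-1/2})$, i.e., the slow rate; the paper's contribution is a bound of the same flavor, derived via Talagrand's concentration inequalities, whose rate improves to $O(n^{-1})$ in the optimistic (low empirical error) regime and with smaller constants.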
Year
2016
DOI
10.1007/s11063-015-9429-2
Venue
Neural Processing Letters
Keywords
Statistical learning theory, Performance estimation, Rademacher complexity, Fast rates
Field
Convergence (routing), Statistical learning theory, Combinatorics, Rademacher complexity, Performance estimation, Rate of convergence, Artificial intelligence, Mathematics, Machine learning
DocType
Journal
Volume
43
Issue
2
ISSN
1370-4621
Citations
11
PageRank
0.50
References
40
Authors
4
1. Luca Oneto
2. Alessandro Ghio
3. Sandro Ridella
4. Davide Anguita