Title
McDiarmid-Type Inequalities for Graph-Dependent Variables and Stability Bounds
Abstract
A crucial assumption in most statistical learning theory is that samples are independently and identically distributed (i.i.d.). However, for many real applications, the i.i.d. assumption does not hold. We consider learning problems in which examples are dependent and their dependency relation is characterized by a graph. To establish algorithm-dependent generalization theory for learning with non-i.i.d. data, we first prove novel McDiarmid-type concentration inequalities for Lipschitz functions of graph-dependent random variables. We show that concentration relies on the forest complexity of the graph, which characterizes the strength of the dependency. We demonstrate that for many types of dependent data, the forest complexity is small and thus implies good concentration. Based on our new inequalities, we establish stability bounds for learning graph-dependent data.
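For background (this context is not part of the record itself): the classical McDiarmid bounded-differences inequality, which the paper generalizes to graph-dependent variables, states that for independent random variables $X_1,\dots,X_n$ and a function $f$ satisfying $|f(x)-f(x')|\le c_i$ whenever $x$ and $x'$ differ only in coordinate $i$:

```latex
\Pr\bigl[f(X_1,\dots,X_n) - \mathbb{E}\,f(X_1,\dots,X_n) \ge t\bigr]
  \;\le\; \exp\!\left(-\frac{2t^2}{\sum_{i=1}^{n} c_i^2}\right)
```

Per the abstract, the paper's graph-dependent version makes the exponent depend additionally on the forest complexity of the dependency graph, which quantifies the strength of the dependence among the variables.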
Year
2019
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019)
Keywords
dependency relation, dependent variables, statistical learning theory, identically distributed
Field
Graph, Discrete mathematics, Mathematical optimization, Computer science, Inequality, Variables
DocType
Conference
Volume
32
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
5
Name               Order  Citations  PageRank
Rui                1      0          0.34
Xingwu Liu         2      19         12.77
Yuyi Wang          3      22         10.01
Liwei Wang         4      1272       88.14
Zhang, Rui (Ray)   5      0          0.34