Title: Collective Stability in Structured Prediction: Generalization from One Example
Abstract: Structured predictors enable joint inference over multiple interdependent output variables. These models are often trained on a small number of examples, each with large internal structure. Existing distribution-free generalization bounds offer no guarantees in this setting, even though a large body of empirical evidence from computer vision, natural language processing, social networks, and other fields shows that such models do generalize in practice. In this paper, we identify a set of natural conditions (weak dependence, hypothesis complexity, and a new measure, collective stability) that are sufficient for generalization from even a single example, without imposing an explicit generative model of the data. We then demonstrate that the complexity and stability conditions are satisfied by a broad class of models, including marginal inference in templated graphical models. We thus obtain uniform convergence rates that can decrease significantly faster than previous bounds, particularly when each structured example is sufficiently large and the number of training examples is small, even just one.
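To make the abstract's rate claim concrete, the sketch below shows, in LaTeX, the general shape a collective-stability-based uniform convergence bound takes. It is an illustrative schematic under assumed notation, not the paper's actual theorem: the symbols m (number of training examples), n (number of interdependent output variables per example), \beta (collective stability), and \|\Theta\|_\infty (a weak-dependence bound) are placeholders, and the published result carries additional complexity terms and constants.

% Schematic only: an assumed form, not the theorem proved in the paper.
% m              : number of structured training examples
% n              : number of interdependent output variables per example
% \beta          : collective stability of the predictor class \mathcal{H}
% \|\Theta\|_inf : bound on the strength of dependence within an example
\[
  \mathcal{R}(h) \;\le\; \widehat{\mathcal{R}}(h)
  \;+\; O\!\left( \frac{\beta \,\lVert \Theta \rVert_{\infty}}{\sqrt{m\,n}} \right)
  \qquad \text{for all } h \in \mathcal{H}.
\]

The point of the sketch is the \sqrt{mn} denominator: because it grows with the internal size n of each example, the right-hand side can shrink even when m is held constant, including m = 1, which is how generalization from a single example becomes possible.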
Year: 2013
Venue: ICML
Field: Small number, Empirical evidence, Computer science, Inference, Structured prediction, Stability conditions, Uniform convergence, Artificial intelligence, Graphical model, Machine learning, Generative model
DocType: Conference
Citations: 16
PageRank: 0.77
References: 17
Authors: 4
Name          Order  Citations  PageRank
Ben London    1      77         7.01
Bert Huang    2      563        39.09
Ben Taskar    3      3175       209.33
Lise Getoor   4      4365       320.21