Title
Stability and Generalization in Structured Prediction
Abstract
Structured prediction models have been found to learn effectively from a few large examples, sometimes even just one. Despite this empirical evidence, canonical learning theory cannot guarantee generalization in this setting because its error bounds decrease only as a function of the number of examples. We therefore propose new PAC-Bayesian generalization bounds for structured prediction that decrease as a function of both the number of examples and the size of each example. Our analysis hinges on the stability of joint inference and the smoothness of the data distribution. We apply our bounds to several common learning scenarios, including max-margin and soft-max training of Markov random fields. Under certain conditions, the resulting error bounds can be far more optimistic than previous results and can even guarantee generalization from a single large example.
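To make the scaling concrete, a bound of the kind the abstract describes might take the following generic PAC-Bayes form (an illustrative sketch under assumed notation, not the paper's exact theorem: m is the number of training examples, n the number of variables per example, Q and P the posterior and prior over hypotheses, and delta the confidence parameter):

\[
\mathbb{E}_{h \sim Q}\!\left[\mathrm{err}(h)\right]
\;\le\;
\mathbb{E}_{h \sim Q}\!\left[\widehat{\mathrm{err}}(h)\right]
+ O\!\left( \sqrt{ \frac{ \mathrm{KL}(Q \,\|\, P) + \ln \frac{mn}{\delta} }{ m\,n } } \right)
\]

Because the denominator grows with the product mn rather than with m alone, such a bound can remain nontrivial even when m = 1, provided the single example is large (n large), which is the regime the paper targets.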
Year
2016
Venue
Journal of Machine Learning Research
Keywords
structured prediction, learning theory, PAC-Bayes, generalization bounds
Field
Random field, Empirical evidence, Learning theory, Inference, Structured prediction, Markov chain, Artificial intelligence, Generalization error, Smoothness, Mathematics, Machine learning
DocType
Journal
Volume
17
ISSN
1532-4435
Citations
1
PageRank
0.35
References
0
Authors
3
Name          Order   Citations   PageRank
Ben London    1       77          7.01
Bert Huang    2       563         39.09
Lise Getoor   3       4365        320.21