Title
Extension of the PAC framework to finite and countable Markov chains
Abstract
We consider a model of learning in which the successive observations follow a certain Markov chain. The observations are labeled according to membership in some unknown target set. For a Markov chain with finitely many states we show that, if the target set belongs to a family of sets with finite Vapnik-Chervonenkis (VC) dimension, then probably approximately correct (PAC) learning of this set is possible with polynomially large samples. Specifically, for observations following a random walk with state space 𝒳 and uniform stationary distribution, the sample size required is no more than O((t0/(1−λ2)) log(t0|𝒳|/δ)), where δ is the confidence parameter, λ2 is the second largest eigenvalue of the transition matrix, and t0 is the sample size sufficient for learning from independent and identically distributed (i.i.d.) observations. We then obtain similar results for Markov chains with countably many states using a Lyapunov function technique and results on the mixing properties of infinite-state Markov chains.
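The sample-size bound stated in the abstract can be evaluated numerically. The sketch below is illustrative only: the function name is hypothetical, and the constant hidden inside the O(·) is taken to be 1, so the returned value shows how the bound scales with the spectral gap 1−λ2 rather than giving an exact sample count.

```python
import math

def pac_sample_bound(t0, lam2, n_states, delta):
    """Evaluate (t0 / (1 - lam2)) * log(t0 * |X| / delta),
    the scaling of the abstract's sample-size bound with the
    hidden O(.) constant taken as 1 (purely illustrative).

    t0       -- sample size sufficient for i.i.d. PAC learning
    lam2     -- second largest eigenvalue of the transition matrix
    n_states -- number of states |X| of the random walk
    delta    -- confidence parameter in (0, 1)
    """
    assert 0.0 <= lam2 < 1.0, "need a positive spectral gap 1 - lam2"
    assert 0.0 < delta < 1.0
    return (t0 / (1.0 - lam2)) * math.log(t0 * n_states / delta)

# Example (hypothetical numbers): i.i.d. budget t0 = 1000,
# lambda_2 = 0.9, |X| = 100 states, delta = 0.05.
bound = pac_sample_bound(1000, 0.9, 100, 0.05)
```

As the chain mixes more slowly (λ2 → 1), the spectral gap 1−λ2 shrinks and the bound grows proportionally, recovering the i.i.d. sample size t0 up to a logarithmic factor when the gap is of constant order.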
Year
2003
DOI
10.1109/TIT.2002.806131
Venue
IEEE Transactions on Information Theory
Keywords
Lyapunov methods, Markov processes, learning systems, probability, set theory, state-space methods, Lyapunov function, PAC framework extension, confidence level, eigenvalue, finite Markov chains, finite Vapnik-Chervonenkis dimension, i.i.d. observations, independent identically distributed observations, learning model, mixing properties, polynomially large samples, probably approximately correct model, random walk, set membership, state space, successive observations, transition matrix, uniform stationary distribution
Field
Discrete mathematics, Countable set, Computer science, Markov chain, Examples of Markov chains
DocType
Journal
Volume
49
Issue
1
ISSN
0018-9448
ISBN
1-58113-167-4
Citations
13
PageRank
0.81
References
10
Authors
1
Name
David Gamarnik
Order
1
Citations
13
PageRank
0.81