Title
Generic Bounds On The Maximum Deviations In Sequential Prediction: An Information-Theoretic Analysis
Abstract
In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
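The abstract's achievability claim (bounds met iff the prediction error is uniformly distributed) can be illustrated with a standard maximum-entropy argument. The sketch below is not the paper's exact bound but an assumed simplified form: for an error supported on [-m, m], differential entropy is maximized by the uniform law, h = log(2m), which rearranges to the lower bound m ≥ exp(h)/2, tight exactly in the uniform case.

```python
import math

# Illustrative sketch (assumed simplified bound, not taken from the paper):
# if a prediction error e is supported on [-m, m], then its differential
# entropy satisfies h(e) <= log(2m), with equality iff e is uniform.
# Rearranging gives a lower bound on the maximum deviation m.

def max_dev_lower_bound(h: float) -> float:
    """Lower bound on the maximum deviation implied by entropy h (in nats)."""
    return math.exp(h) / 2.0

# Uniform error on [-1, 1]: h = log(2) nats; the bound is tight (equals 1).
h_uniform = math.log(2.0)
print(max_dev_lower_bound(h_uniform))  # 1.0

# Symmetric triangular error on [-1, 1]: h = 1/2 + log(1) = 0.5 nats,
# strictly below log(2), so the bound is strict: exp(0.5)/2 ≈ 0.824 < 1.
h_triangular = 0.5
print(max_dev_lower_bound(h_triangular))
```

This mirrors the asymptotic statement in the abstract: only the maximum-entropy (uniform) error distribution makes the deviation bound an equality.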
Year
2019
DOI
10.1109/MLSP.2019.8918758
Venue
2019 IEEE 29th International Workshop on Machine Learning for Signal Processing (MLSP)
Keywords
Information-theoretic learning, sequential learning, sequential prediction, bounds on performance, sequence prediction
Field
Sequence prediction, Data point, Mean squared prediction error, Random variable, Pattern recognition, Computer science, Algorithm, Stochastic process, If and only if, Artificial intelligence, Conditional entropy, Sequence learning
DocType
Conference
ISSN
1551-2541
ISBN
978-1-7281-0825-4
Citations
1
PageRank
0.41
References
4
Authors
2
Name          Order  Citations  PageRank
Fang Song     1      23         10.76
Zhu Quanyan   2      1295       116.31