Title
Fast Computation of the Kullback–Leibler Divergence and Exact Fisher Information for the First-Order Moving Average Model
Abstract
In this note, expressions are derived that allow computation of the Kullback–Leibler (K-L) divergence between two first-order Gaussian moving average models in O_n(1) time as the sample size n → ∞. These expressions can also be used to evaluate the exact Fisher information matrix in O_n(1) time, and they provide a basis for an asymptotic expression of the K-L divergence.
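As an illustration of the quantities involved (not the paper's closed-form O_n(1) expressions), the sketch below numerically evaluates the asymptotic per-sample K-L rate between two Gaussian MA(1) models via the standard spectral-density formula for stationary Gaussian processes, and prints the well-known asymptotic Fisher information 1/(1 - theta^2) for the MA(1) parameter. The helper names (ma1_spec, kl_rate) are illustrative choices, not taken from the paper.

    import numpy as np
    from scipy.integrate import quad

    def ma1_spec(omega, theta, sigma2=1.0):
        # Spectral density of the MA(1) process x_t = e_t + theta * e_{t-1}
        # with innovation variance sigma2: f(w) = sigma2/(2*pi) * |1 + theta*e^{iw}|^2.
        return sigma2 / (2.0 * np.pi) * (1.0 + 2.0 * theta * np.cos(omega) + theta ** 2)

    def kl_rate(theta_p, theta_q, sigma2_p=1.0, sigma2_q=1.0):
        # Asymptotic per-sample K-L rate between two zero-mean stationary Gaussian
        # processes with spectral densities f and g:
        #   (1 / (4*pi)) * integral over [-pi, pi] of [ f/g - log(f/g) - 1 ] dw.
        # Numerical quadrature here; the paper derives closed-form expressions instead.
        def integrand(w):
            r = ma1_spec(w, theta_p, sigma2_p) / ma1_spec(w, theta_q, sigma2_q)
            return r - np.log(r) - 1.0
        value, _ = quad(integrand, -np.pi, np.pi)
        return value / (4.0 * np.pi)

    if __name__ == "__main__":
        # K-L rate between two invertible MA(1) models differing only in theta.
        print(kl_rate(0.3, 0.6))
        # Asymptotic Fisher information for the MA(1) parameter theta.
        theta = 0.3
        print(1.0 / (1.0 - theta ** 2))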
Year
2010
DOI
10.1109/LSP.2009.2039659
Venue
Signal Processing Letters, IEEE
Keywords
Gaussian processes, information theory, moving average processes, Kullback–Leibler divergence, first-order Gaussian moving average models, Fisher information, moving average models
Field
Pattern recognition, Jensen–Shannon divergence, Divergence (statistics), Artificial intelligence, Fisher information, Gaussian process, Moving average, Kullback–Leibler divergence, Moving-average model, Mathematics, Autocorrelation
DocType
Journal
Volume
17
Issue
4
ISSN
1070-9908
Citations
5
PageRank
0.55
References
0
Authors
2
Name               Order  Citations  PageRank
Enes Makalic       1      55         11.54
Daniel F. Schmidt  2      51         10.68