Title
Second-order asymptotics of mutual information
Abstract
A formula for the second-order expansion of the input-output mutual information of multidimensional channels as the signal-to-noise ratio (SNR) goes to zero is obtained. While the additive noise is assumed to be Gaussian, we deal with very general classes of input and channel distributions. As special cases, these channel models include fading channels, channels with random parameters, and channels with almost Gaussian noise. When the channel is unknown at the receiver, the second term in the asymptotic expansion depends not only on the covariance matrix of the input signal but also on the fourth mixed moments of its components. The study of the second-order asymptotics of mutual information finds application in the analysis of the bandwidth-power tradeoff achieved by various signaling strategies in the wideband regime.
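The paper's result covers general multidimensional inputs and channel distributions. As a minimal numerical sketch (added here for illustration, not taken from the paper), the Python snippet below checks the idea in the simplest special case: a scalar Gaussian-noise channel with Gaussian input, where the mutual information has a closed form whose Taylor expansion at zero SNR supplies the first- and second-order terms directly. The input power P is an assumed parameter of the illustration.

```python
import numpy as np

# Illustration only (not the paper's general formula): for the scalar channel
# Y = sqrt(snr) * X + N with N ~ N(0, 1) and Gaussian input X ~ N(0, P),
#   I(snr) = 0.5 * ln(1 + snr * P)   [nats],
# so the low-SNR expansion is
#   I(snr) = (snr * P) / 2 - (snr * P)**2 / 4 + o(snr**2).

P = 1.0  # assumed input power for the illustration

for snr in [1e-1, 1e-2, 1e-3]:
    exact = 0.5 * np.log(1.0 + snr * P)
    first_order = 0.5 * snr * P
    second_order = 0.5 * snr * P - 0.25 * (snr * P) ** 2
    print(f"snr={snr:.0e}  exact={exact:.6e}  "
          f"1st-order err={abs(exact - first_order):.2e}  "
          f"2nd-order err={abs(exact - second_order):.2e}")
```

As the SNR decreases by a factor of ten, the error of the first-order term shrinks roughly by a factor of one hundred, while the error of the two-term expansion shrinks roughly by a factor of one thousand, consistent with a valid second-order expansion.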
Year
2004
DOI
10.1109/TIT.2004.831784
Venue
IEEE Transactions on Information Theory
Keywords
second order, asymptotic expansion, mutual information, channel capacity, covariance matrix, Gaussian noise, fading channel, input-output, signal-to-noise ratio, information theory
Field
Information theory, Statistical physics, Discrete mathematics, Fading, Signal-to-noise ratio, Gaussian, Mutual information, Covariance matrix, Statistics, Gaussian noise, Channel capacity, Mathematics
DocType
Journal
Volume
50
Issue
8
ISSN
0018-9448
Citations
61
PageRank
4.52
References
11
Authors
2
Name                  Order  Citations  PageRank
Vyacheslav V. Prelov  1      145        29.59
Sergio Verdú          2      39563      60.80