Title |
---|
MMSE Bounds for Additive Noise Channels Under Kullback-Leibler Divergence Constraints on the Input Distribution |
Abstract |
---|
Upper and lower bounds on the minimum mean square error for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is define... |
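The abstract concerns the minimum mean square error (MMSE) for additive noise channels with a Gaussian reference input. As background only (not the paper's divergence-constrained bounds, whose statement is truncated above), the following sketch checks the classical closed-form MMSE for a Gaussian input in additive Gaussian noise against a Monte Carlo estimate; the variances are arbitrary values chosen for illustration.

```python
import numpy as np

# Scalar Gaussian channel Y = X + N with X ~ N(0, sigma_x2), N ~ N(0, sigma_n2).
sigma_x2, sigma_n2 = 2.0, 1.0
rng = np.random.default_rng(0)
n = 200_000

x = rng.normal(0.0, np.sqrt(sigma_x2), n)
y = x + rng.normal(0.0, np.sqrt(sigma_n2), n)

# Conditional-mean (MMSE-optimal) estimator for the jointly Gaussian case.
x_hat = (sigma_x2 / (sigma_x2 + sigma_n2)) * y

mmse_mc = np.mean((x - x_hat) ** 2)                    # Monte Carlo estimate
mmse_cf = sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)  # closed form

print(mmse_mc, mmse_cf)
```

With these variances the closed form gives 2/3, and the empirical average of the squared estimation error agrees to within sampling noise.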
Year | DOI | Venue |
---|---|---|
2019 | 10.1109/TSP.2019.2951221 | IEEE Transactions on Signal Processing |
Keywords | Field | DocType
---|---|---
Upper bound, Covariance matrices, Gaussian distribution, Estimation, Entropy, Cramer-Rao bounds, Minimax techniques | Social psychology, Applied mathematics, Communication channel, Psychology, Kullback–Leibler divergence | Journal
Volume | Issue | ISSN
---|---|---
67 | 24 | 1053-587X
Citations | PageRank | References
---|---|---
1 | 0.36 | 0
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Alex Dytso | 1 | 45 | 20.03 |
Michael Fauss | 2 | 6 | 9.05 |
Abdelhak M. Zoubir | 3 | 1036 | 148.03 |
H. V. Poor | 4 | 25411 | 1951.66 |