Abstract |
---|
We consider the “remote-sampling” problem of choosing a subset S, with |S|=s, from a set N of observable random variables, so as to obtain as much information as possible about a set T of target random variables which are not directly observable. Our criterion is that of minimizing the entropy of T conditioned on S. We confine our attention to the case in which the random variables have a joint Gaussian distribution. We demonstrate that the problem is NP-complete. We provide two methods for calculating lower bounds on the entropy: (i) a spectral method, and (ii) a continuous nonlinear relaxation. We employ these bounds in a branch-and-bound scheme to solve problem instances to optimality. |
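The objective described in the abstract has a closed form for jointly Gaussian variables: the conditional entropy H(T|S) is determined by the Schur complement of the observed block of the covariance matrix. The sketch below is an illustration of that criterion under assumed notation (the covariance matrix `cov`, the toy instance, and the brute-force subset search are all hypothetical, not the authors' branch-and-bound code):

```python
# Illustration (an assumption based on the abstract, not the paper's code):
# for a joint Gaussian, H(T|S) = 0.5 * log((2*pi*e)^t * det(Sigma_{T|S})),
# where Sigma_{T|S} = Sigma_TT - Sigma_TS @ inv(Sigma_SS) @ Sigma_ST
# is the Schur complement and t = |T|.
from itertools import combinations

import numpy as np


def conditional_entropy(cov, T, S):
    """H(T | S) in nats for a joint Gaussian with covariance `cov`."""
    T, S = list(T), list(S)
    Stt = cov[np.ix_(T, T)]
    Sts = cov[np.ix_(T, S)]
    Sss = cov[np.ix_(S, S)]
    # Conditional covariance of T given S (Schur complement).
    schur = Stt - Sts @ np.linalg.solve(Sss, Sts.T)
    sign, logdet = np.linalg.slogdet(schur)
    return 0.5 * (len(T) * np.log(2 * np.pi * np.e) + logdet)


# Toy instance: observables are indices 0-3, the target is index 4.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
cov = A @ A.T + 5 * np.eye(5)  # positive-definite joint covariance

# On this tiny instance we can enumerate all size-2 subsets directly;
# the paper's branch-and-bound scheme avoids such enumeration at scale.
best = min(combinations(range(4), 2),
           key=lambda S: conditional_entropy(cov, [4], S))
print("best subset:", best)
```

Observing more variables can only decrease the conditional entropy, which is what makes a branch-and-bound search with lower bounds effective here.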
Year | DOI | Venue |
---|---|---|
2001 | 10.1016/S0166-218X(00)00217-1 | Discrete Applied Mathematics |
Keywords | Field | DocType
---|---|---|
lower bound, random variable, gaussian distribution, spectral method, maximum entropy, branch and bound | Information theory, Combinatorics, Random variable, Joint probability distribution, Nonlinear system, Probability distribution, Gaussian, Principle of maximum entropy, Covariance matrix, Mathematics | Journal

Volume | Issue | ISSN
---|---|---|
108 | 3 | 0166-218X

Citations | PageRank | References
---|---|---|
5 | 0.83 | 4
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Kurt M. Anstreicher | 1 | 633 | 86.40 |
Marcia Fampa | 2 | 5 | 1.17 |
Jon Lee | 3 | 5 | 0.83 |
Joy Williams | 4 | 22 | 3.35 |