Abstract |
---|
The problem of determining both the maximum and minimum entropy of a random variable Y, as well as the maximum absolute value of the difference between the entropies of Y and another random variable X, is considered under the condition that the probability distribution of X is fixed and the error probability (i.e., the probability that the random values of X and Y do not coincide) is given. An exact expression for the minimum entropy of Y is found. Conditions under which the entropy of Y attains its maximum value are identified. In the remaining cases, lower and upper bounds are obtained both for the maximum entropy of Y and for the maximum absolute value of the difference between the entropies of Y and X. |
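The extremization problem described in the abstract can be illustrated with a small numerical sketch: once the distribution of X and the error probability ε = P(X ≠ Y) are fixed, different admissible channels from X to Y still yield different values of H(Y), and the paper asks for the extremes of H(Y) and of |H(Y) − H(X)|. The distribution and the two couplings below are illustrative assumptions, not taken from the paper.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical setup: a fixed distribution of X and a target error
# probability eps = P(X != Y); neither value comes from the paper.
p_x = [0.5, 0.3, 0.2]
eps = 0.1
n = len(p_x)

# Coupling 1: keep Y = X with probability 1 - eps; on an error, move to the
# next symbol cyclically.  Each row of the channel puts mass eps off the
# diagonal, so P(X != Y) = eps by construction.
p_y_shift = [(1 - eps) * p_x[i] + eps * p_x[(i - 1) % n] for i in range(n)]

# Coupling 2: on an error, send every symbol to symbol 0 (and symbol 0 to
# symbol 1), again with total error probability eps.
p_y_concentrated = [0.0] * n
for i, q in enumerate(p_x):
    p_y_concentrated[i] += (1 - eps) * q             # correct transmissions
    p_y_concentrated[0 if i != 0 else 1] += eps * q  # all errors piled up

print(f"H(X)                = {entropy(p_x):.4f} bits")
print(f"H(Y), cyclic errors = {entropy(p_y_shift):.4f} bits")
print(f"H(Y), piled errors  = {entropy(p_y_concentrated):.4f} bits")
```

Both couplings satisfy P(X ≠ Y) = eps yet induce different output entropies, which is exactly why minimizing or maximizing H(Y) over all such couplings is a nontrivial problem.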
Year | DOI | Venue
---|---|---
2014 | 10.1134/S003294601403016 | Problems of Information Transmission
Keywords | Field | DocType
---|---|---
Entropy, Mutual Information, Diagonal Element, Error Probability, Joint Distribution | Entropy rate, Combinatorics, Conditional probability distribution, Uniform distribution (continuous), Symmetric probability distribution, Differential entropy, Principle of maximum entropy, Min entropy, Mathematics, Maximum entropy probability distribution | Journal
Volume | Issue | ISSN
---|---|---
50 | 3 | 1608-3253
Citations | PageRank | References
---|---|---
0 | 0.34 | 8
Authors (1)
Name | Order | Citations | PageRank
---|---|---|---
Vyacheslav V. Prelov | 1 | 145 | 29.59