Title |
---|
The maximum mutual information between the output of a binary symmetric channel and a Boolean function of its input. |
Abstract |
---|
We prove the Courtade-Kumar conjecture, which states that the mutual information between any Boolean function of an $n$-dimensional vector of independent and identically distributed inputs to a memoryless binary symmetric channel and the corresponding vector of outputs is upper-bounded by $1-\operatorname{H}(p)$, where $\operatorname{H}(p)$ denotes the binary entropy function. That is, let $\mathbf{X}=[X_1 \ldots X_n]$ be a vector of independent and identically distributed Bernoulli($1/2$) random variables, which form the input to a memoryless binary symmetric channel with error probability $0 \leq p \leq 1/2$, and let $\mathbf{Y}=[Y_1 \ldots Y_n]$ be the corresponding output. Let $f:\{0,1\}^n \rightarrow \{0,1\}$ be an $n$-dimensional Boolean function. Then, $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$. We provide the proof for the most general case of the conjecture, that is, for any $n$-dimensional Boolean function $f$ and for any value of the error probability of the binary symmetric channel, $0 \leq p \leq 1/2$. Our proof employs only basic concepts from information theory, probability theory, and transformations of random variables and vectors. |
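The conjectured bound $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) \leq 1-\operatorname{H}(p)$ can be checked numerically by brute force for small $n$. The sketch below is not from the paper; the function names are my own, and it simply enumerates the joint distribution of $(f(\mathbf{X}),\mathbf{Y})$ for a uniform input and a BSC with crossover probability $p$. The dictator function $f(\mathbf{x})=x_1$ is known to attain the bound with equality.

```python
import itertools
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(f, n, p):
    """Brute-force I(f(X); Y) in bits for uniform X in {0,1}^n
    passed through a memoryless BSC with crossover probability p."""
    joint = {}  # joint distribution of (f(x), y)
    for x in itertools.product([0, 1], repeat=n):
        px = 2.0 ** (-n)  # uniform input
        for y in itertools.product([0, 1], repeat=n):
            flips = sum(xi != yi for xi, yi in zip(x, y))
            py_given_x = (p ** flips) * ((1 - p) ** (n - flips))
            key = (f(x), y)
            joint[key] = joint.get(key, 0.0) + px * py_given_x
    # marginals of f(X) and Y
    pf, py = {}, {}
    for (fv, y), pr in joint.items():
        pf[fv] = pf.get(fv, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    # I(f(X); Y) = sum p(f,y) log2( p(f,y) / (p(f) p(y)) )
    mi = 0.0
    for (fv, y), pr in joint.items():
        if pr > 0.0:
            mi += pr * math.log2(pr / (pf[fv] * py[y]))
    return mi

p = 0.11
n = 3
dictator = lambda x: x[0]
mi = mutual_information(dictator, n, p)
bound = 1 - binary_entropy(p)
# The dictator function attains the conjectured maximum: mi equals bound
# (up to floating-point error); other functions, e.g. majority, stay below it.
```

For $f(\mathbf{x})=x_1$ we have $\operatorname{MI}(f(\mathbf{X}),\mathbf{Y}) = \operatorname{MI}(X_1, Y_1) = 1-\operatorname{H}(p)$, since the remaining outputs are independent of $X_1$; the brute-force value matches this to machine precision.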
Year | Venue | Field |
---|---|---|
2016 | arXiv: Information Theory | Information theory, Boolean function, Discrete mathematics, Random variable, Binary symmetric channel, Combinatorics, Binary entropy function, Independent and identically distributed random variables, Probability theory, Conjecture, Mathematics |
DocType | Volume | Citations |
---|---|---|
Journal | abs/1604.05113 | 0 |

PageRank | References | Authors |
---|---|---|
0.34 | 4 | 1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Septimia Sarbu | 1 | 4 | 1.52 |