Title
A Proof of Entropy Minimization for Outputs in Deletion Channels via Hidden Word Statistics.
Abstract
From the output produced by a memoryless deletion channel acting on a uniformly random input of known length $n$, one obtains a posterior distribution on the channel input. The difference between the Shannon entropy of this distribution and that of the uniform prior measures the amount of information about the channel input conveyed by an output of length $m$, and it is natural to ask for which outputs this is extremized. This question was posed in a previous work, where it was conjectured on the basis of experimental data that the entropy of the posterior is minimized by the constant strings $\texttt{000}\ldots$ and $\texttt{111}\ldots$ and maximized by the alternating strings $\texttt{0101}\ldots$ and $\texttt{1010}\ldots$. In the present work we confirm the minimization conjecture in the asymptotic limit using results from hidden word statistics. We show how the analytic-combinatorial methods of Flajolet, Szpankowski and Vallée for the hidden pattern matching problem can be applied to resolve the case of fixed output length and $n\rightarrow\infty$, by obtaining estimates for the entropy in terms of the moments of the posterior distribution and establishing its minimization via a measure of autocorrelation.
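As a brute-force illustration of the setup (not taken from the paper): given an output $y$ of a deletion channel with i.i.d. deletions and a uniformly random input of length $n$, the posterior weight of each candidate input $x$ is proportional to the number of ways $y$ embeds in $x$ as a scattered subsequence, i.e. the hidden word statistic. The sketch below enumerates all binary inputs of a small length, counts embeddings with the standard dynamic program, and computes the Shannon entropy of the resulting posterior; the function names and parameters are our own.

```python
from itertools import product
from math import log2

def num_embeddings(x: str, y: str) -> int:
    """Count occurrences of y as a (scattered) subsequence of x via DP."""
    m = len(y)
    # dp[j] = number of embeddings of y[:j] into the prefix of x seen so far
    dp = [1] + [0] * m
    for c in x:
        for j in range(m, 0, -1):  # descending so each char of x is used once
            if y[j - 1] == c:
                dp[j] += dp[j - 1]
    return dp[m]

def posterior_entropy(y: str, n: int) -> float:
    """Shannon entropy (bits) of the posterior on inputs of length n given output y.

    With a uniform prior and i.i.d. deletions, P(x | y) is proportional to
    the embedding count omega_x(y); inputs not containing y get weight zero.
    """
    weights = [num_embeddings("".join(x), y) for x in product("01", repeat=n)]
    total = sum(weights)
    return -sum(w / total * log2(w / total) for w in weights if w > 0)
```

For small parameters one can check the minimization conjecture directly, e.g. comparing `posterior_entropy("000", 6)` against `posterior_entropy("010", 6)`: the constant output should yield the smaller posterior entropy.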
Year
2018
Venue
arXiv: Information Theory
Field
Analytic combinatorics, Discrete mathematics, Posterior probability, Minification, Deletion channel, Statistics, Pattern matching, Entropy (information theory), Conjecture, Mathematics, Autocorrelation
DocType
Journal
Volume
abs/1807.11609
Citations
0
PageRank
0.34
References
0
Authors
4
Name               Order  Citations  PageRank
Arash Atashpendar  1      9          3.21
David Mestel       2      0          1.01
A. W. Roscoe       3      312        5.90
Peter Y. A. Ryan   4      7286       6.96