Title
On Decoding Irregular Tanner Codes With Local-Optimality Guarantees
Abstract
We consider decoding of binary linear Tanner codes using message-passing iterative decoding and linear-programming (LP) decoding in memoryless binary-input output-symmetric (MBIOS) channels. We present new certificates that are based on a combinatorial characterization for the local optimality of a codeword in irregular Tanner codes with respect to any MBIOS channel. This characterization is a generalization of (Arora et al., Proc. ACM Symp. Theory of Computing, 2009) and (Vontobel, Proc. Inf. Theory and Appl. Workshop, 2010) and is based on a conical combination of normalized weighted subtrees in the computation trees of the Tanner graph. These subtrees may have any finite height $h$ (even equal to or greater than half of the girth of the Tanner graph). In addition, the degrees of local-code nodes in these subtrees are not restricted to two (i.e., these subtrees are not restricted to skinny trees). We prove that local optimality in this new characterization implies maximum-likelihood (ML) optimality and LP optimality, and show that a certificate can be computed efficiently. We also present a new message-passing iterative decoding algorithm, called normalized weighted min-sum (NWMS). NWMS decoding is a belief-propagation (BP) type algorithm that applies to any irregular binary Tanner code with single parity-check local codes (e.g., low-density and high-density parity-check codes). We prove that if a locally optimal codeword with respect to height parameter $h$ exists (where, notably, $h$ is not limited by the girth of the Tanner graph), then NWMS decoding finds this codeword in $h$ iterations. The decoding guarantee of the NWMS decoding algorithm applies whenever there exists a locally optimal codeword. Because local optimality of a codeword implies that it is the unique ML codeword, the decoding guarantee also provides an ML certificate for this codeword. Finally, we apply the new local-optimality characterization to regular Tanner codes, and prove lower bounds on the noise thresholds of LP decoding in MBIOS channels. When the noise is below these lower bounds, the probability that LP decoding fails to decode the transmitted codeword decays doubly exponentially in the girth of the Tanner graph.
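For concreteness, below is a minimal sketch of a generic normalized (scaled) min-sum message-passing decoder for a code whose local codes are single parity checks, i.e., a code given by a parity-check matrix H. This is not the paper's NWMS algorithm: NWMS uses degree- and level-dependent weight vectors tied to the height parameter h, whereas the single scaling factor alpha, the iteration count, and the toy Hamming-code parity-check matrix and LLR vector used here are illustrative assumptions only.

# A minimal sketch of a normalized (scaled) min-sum decoder for an LDPC-style
# parity-check matrix H.  The scaling factor `alpha`, the toy matrix, and the
# LLR vector are illustrative assumptions; this is NOT the paper's NWMS
# algorithm, which uses level- and degree-dependent weight vectors.
import numpy as np

def normalized_min_sum(H, llr, iterations=10, alpha=0.8):
    """Decode channel LLRs with scaled min-sum message passing.

    H          -- binary parity-check matrix, shape (m, n)
    llr        -- channel log-likelihood ratios, shape (n,); positive favors bit 0
    iterations -- number of message-passing iterations (plays the role of h)
    alpha      -- normalization factor applied to check-to-variable messages
    """
    m, n = H.shape
    c2v = np.zeros((m, n))          # check-to-variable messages
    for _ in range(iterations):
        # Variable-to-check: channel LLR plus all incoming check messages,
        # excluding the message on the edge being updated (extrinsic rule).
        total = llr + c2v.sum(axis=0)
        v2c = (total - c2v) * H
        # Check-to-variable: product of signs and minimum magnitude of the
        # other incoming messages, scaled by alpha.
        for i in range(m):
            idx = np.flatnonzero(H[i])
            msgs = v2c[i, idx]
            signs = np.sign(msgs)
            signs[signs == 0] = 1.0
            mags = np.abs(msgs)
            for k, j in enumerate(idx):
                others = np.delete(np.arange(len(idx)), k)
                c2v[i, j] = alpha * np.prod(signs[others]) * mags[others].min()
        # Hard decision and early stopping once all parity checks are satisfied.
        decision = (llr + c2v.sum(axis=0) < 0).astype(int)
        if not np.any(H.dot(decision) % 2):
            return decision
    return decision

# Toy example (an assumption for illustration): the (7,4) Hamming code's
# parity-check matrix with the all-zero codeword sent over a BI-AWGN-like
# channel, so positive LLRs are "correct" and one bit is unreliable.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
llr = np.array([2.1, 1.8, -0.4, 2.5, 1.9, 2.2, 1.7])
print(normalized_min_sum(H, llr, iterations=5))

Running the snippet corrects the unreliable bit and returns the all-zero codeword. In the paper's setting, the analogous guarantee is that h iterations of NWMS recover the transmitted codeword whenever a locally optimal codeword with height parameter h exists.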
Year
2014
DOI
10.1109/TIT.2013.2284912
Venue
IEEE Transactions on Information Theory
Keywords
binary codes, channel coding, graph theory, iterative decoding, linear codes, linear programming, maximum likelihood decoding, message passing, BP type algorithm, LP decoding, MBIOS channel, ML optimality, NWMS decoding algorithm, Tanner graph, belief-propagation type algorithm, binary linear Tanner code, irregular Tanner code decoding, linear-programming decoding, local-code node, maximum-likelihood optimality, memoryless binary-input output-symmetric channel, message-passing iterative decoding algorithm, normalized weighted min-sum, parity-check local code, error bounds, Tanner codes, factor graphs, graph cover, LDPC codes, linear programming (LP) decoding, local optimality, max-product algorithm, maximum-likelihood (ML) certificate, memoryless binary-input output-symmetric (MBIOS) channel, message-passing algorithms, min-sum algorithm, thresholds
Field
Graph theory, Discrete mathematics, Combinatorics, Sequential decoding, Parity-check matrix, Binary code, Decoding methods, Tanner graph, Standard array, List decoding, Mathematics
DocType
Journal
Volume
60
Issue
1
ISSN
0018-9448
Citations
1
PageRank
0.36
References
24
Authors
2
Name            Order   Citations   PageRank
Nissim Halabi   1       8           3.58
Guy Even        2       2           1.40