Message Passing Decoding of LDPC Codes on Other Channels
We now describe the general message passing algorithm for achieving bit-wise Maximum A Posteriori (MAP) decoding of linear codes on general channels. We focus on Additive White Gaussian Noise (AWGN) channels, but the principles can be adapted to other channel models.
Bit-wise MAP decoding principle for linear codes
Let $x = (x_1, \ldots, x_n) \in C$ be the codeword transmitted from the linear code $C$ of dimension $k$, and let $y = (y_1, \ldots, y_n)$ be the vector received from the channel. The bit-wise MAP estimate of the bit $x_i$ is based on the value of the log-likelihood ratio (LLR) of $x_i$ given $y$, denoted by $L(x_i \mid y)$ and defined as follows.
$$L(x_i \mid y) \triangleq \log \frac{p(x_i = 0 \mid y)}{p(x_i = 1 \mid y)}.$$
In the case of AWGN channels, we assume without loss of generality that $x_i \in \{+1, -1\}$ under a bipolar (BPSK) signalling scheme. The LLR is then
$$L(x_i \mid y) \triangleq \log \frac{p(x_i = +1 \mid y)}{p(x_i = -1 \mid y)}.$$
The bit-wise MAP estimate of $x_i$, denoted $\hat{x}_i$, is then calculated as
$$\hat{x}_i = \begin{cases} 0, & \text{if } L(x_i \mid y) > 0, \\ 1, & \text{if } L(x_i \mid y) < 0. \end{cases}$$
Ties are broken arbitrarily, i.e., the decoder chooses the estimate $\hat{x}_i$ randomly if $L(x_i \mid y) = 0$.
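As a minimal sketch of this decision rule (Python, with an illustrative function name):

```python
import random

def bitwise_map_decision(llr):
    """Hard decision from the LLR L(x_i | y): a positive LLR favors
    bit 0, a negative LLR favors bit 1, and ties are broken randomly."""
    if llr > 0:
        return 0
    if llr < 0:
        return 1
    return random.choice([0, 1])  # tie: choose arbitrarily
```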
As before, we assume that the Tanner graph of the code C is cycle-free, i.e., it contains no sequence of edges between adjacent vertices that starts and ends at the same vertex. The Tanner graph is therefore a tree. With this assumption in mind, the belief propagation approach is described on the Tanner graph of the code, which leads to an efficient computation of the bit-wise MAP estimate of each transmitted bit.
Message Passing for LDPC Codes on AWGN Channels
The Min-Sum algorithm, a simplified approximation of the sum-product (belief propagation) algorithm, is an iterative procedure where messages representing LLRs are passed between variable nodes and check nodes on the Tanner graph.
The Min-Sum Decoding Algorithm
The algorithm consists of initialization followed by a series of iterations until a termination condition is met.
Step 1: Initialization
Channel LLRs: For each received value $y_i$ from the AWGN channel with noise variance $\sigma^2 = N_0/2$, and assuming BPSK modulation ($0 \mapsto +1$, $1 \mapsto -1$), the initial channel LLR is calculated as:
$$L_{\text{ch}}(x_i) = \frac{2 y_i}{\sigma^2}.$$
This value represents the initial belief about bit xi from the channel observation alone.
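A one-line sketch of the channel LLR computation (function name illustrative):

```python
def channel_llrs(y, sigma2):
    """Initial channel LLRs for BPSK over AWGN: L_ch(x_i) = 2*y_i / sigma^2.
    `y` is the received vector, `sigma2` the noise variance."""
    return [2.0 * yi / sigma2 for yi in y]
```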
Message Initialization: Messages from check nodes to variable nodes are set to zero before the first iteration. Let $L_{j \to i}^{(k)}$ be the message from check node $c_j$ to variable node $v_i$ in iteration $k$. Then $L_{j \to i}^{(0)} = 0$ for all edges $(v_i, c_j)$.
Step 2: Iterative Message Passing
In each iteration $k = 1, 2, \ldots$, the following two message updates are performed.
A. Variable Node (VN) to Check Node (CN) Update:
Each variable node $v_i$ sends a message $L_{i \to j}^{(k)}$ to each adjacent check node $c_j$. This message is the sum of the channel LLR and all incoming messages from other check nodes in the previous iteration.
$$L_{i \to j}^{(k)} = L_{\text{ch}}(x_i) + \sum_{j' \in N(v_i) \setminus \{j\}} L_{j' \to i}^{(k-1)},$$
where $N(v_i)$ is the set of check nodes connected to $v_i$.
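The update can be sketched as follows, representing a variable node's incoming messages as a dict keyed by check-node index (names and data layout are illustrative):

```python
def vn_to_cn_message(llr_ch, c2v_msgs, j):
    """Variable-to-check message: channel LLR plus all incoming
    check-to-variable messages except the one from check node j
    (the extrinsic-information principle).
    `c2v_msgs` maps neighbouring check index -> incoming message."""
    return llr_ch + sum(m for jp, m in c2v_msgs.items() if jp != j)
```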
B. Check Node (CN) to Variable Node (VN) Update:
Each check node $c_j$ sends a message $L_{j \to i}^{(k)}$ to each adjacent variable node $v_i$. The Min-Sum approximation provides a simplified rule for this update:
$$L_{j \to i}^{(k)} = \left( \prod_{i' \in N(c_j) \setminus \{i\}} \operatorname{sgn}\!\big(L_{i' \to j}^{(k)}\big) \right) \times \min_{i' \in N(c_j) \setminus \{i\}} \big| L_{i' \to j}^{(k)} \big|,$$
where $N(c_j)$ is the set of variable nodes connected to $c_j$. The message's sign is determined by the product of the signs of the incoming messages (to satisfy the parity check), and its magnitude is the minimum of the incoming magnitudes.
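A sketch of the Min-Sum check-node rule under the same dict-based message layout (illustrative names; assumes the check node has degree at least 2):

```python
def cn_to_vn_message(v2c_msgs, i):
    """Min-Sum check-to-variable message: the sign is the product of
    the signs of the other incoming messages (to satisfy the parity
    check), and the magnitude is the minimum of their magnitudes.
    `v2c_msgs` maps neighbouring variable index -> incoming message."""
    others = [m for ip, m in v2c_msgs.items() if ip != i]
    sign = 1.0
    for m in others:
        sign = -sign if m < 0 else sign
    return sign * min(abs(m) for m in others)
```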
Step 3: LLR Aggregation and Decision
After the check node update, each variable node aggregates all its information to compute a total LLR and make a hard decision.
Total LLR:
$$L_{\text{total}}^{(k)}(x_i) = L_{\text{ch}}(x_i) + \sum_{j \in N(v_i)} L_{j \to i}^{(k)}.$$
Hard Decision: A temporary decoded vector $\hat{x}^{(k)}$ is formed using the rule: $\hat{x}_i = 0$ if $L_{\text{total}}^{(k)}(x_i) \ge 0$, and $\hat{x}_i = 1$ otherwise.
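The aggregation and hard decision can be sketched as (names illustrative):

```python
def total_llr_and_decision(llr_ch, c2v_msgs):
    """Aggregate the channel LLR with all incoming check-to-variable
    messages and make a hard decision (a total LLR >= 0 maps to bit 0)."""
    total = llr_ch + sum(c2v_msgs.values())
    return total, (0 if total >= 0 else 1)
```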
Step 4: Termination
The iterations stop if either of these conditions is met:
Valid Codeword: The hard decision vector $\hat{x}^{(k)}$ satisfies all parity checks (i.e., $H (\hat{x}^{(k)})^T = 0$). The algorithm terminates successfully.
Maximum Iterations: The algorithm reaches a predefined maximum number of iterations, $k_{\max}$. If the result is not a valid codeword, a decoding failure is declared.
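Putting Steps 1–4 together, the whole decoder can be sketched end-to-end (a minimal Python implementation with illustrative names; it assumes every check node has degree at least 2):

```python
def min_sum_decode(H, llr_ch, max_iter=20):
    """Min-Sum decoding sketch. `H` is the parity-check matrix as a list
    of 0/1 rows, `llr_ch` the vector of channel LLRs. Returns the
    hard-decision vector and a flag for whether it is a valid codeword."""
    m, n = len(H), len(llr_ch)
    vn_nbrs = [[j for j in range(m) if H[j][i]] for i in range(n)]
    cn_nbrs = [[i for i in range(n) if H[j][i]] for j in range(m)]
    # Step 1: check-to-variable messages initialised to zero
    c2v = {(j, i): 0.0 for j in range(m) for i in cn_nbrs[j]}
    x_hat = [0] * n
    for _ in range(max_iter):
        # Step 2A: variable-to-check update (extrinsic sums)
        v2c = {}
        for i in range(n):
            for j in vn_nbrs[i]:
                v2c[(i, j)] = llr_ch[i] + sum(
                    c2v[(jp, i)] for jp in vn_nbrs[i] if jp != j)
        # Step 2B: check-to-variable update (Min-Sum rule)
        for j in range(m):
            for i in cn_nbrs[j]:
                others = [v2c[(ip, j)] for ip in cn_nbrs[j] if ip != i]
                sign = 1.0
                for msg in others:
                    sign = -sign if msg < 0 else sign
                c2v[(j, i)] = sign * min(abs(msg) for msg in others)
        # Step 3: total LLRs and hard decision
        totals = [llr_ch[i] + sum(c2v[(j, i)] for j in vn_nbrs[i])
                  for i in range(n)]
        x_hat = [0 if L >= 0 else 1 for L in totals]
        # Step 4: stop as soon as all parity checks are satisfied
        if all(sum(H[j][i] * x_hat[i] for i in range(n)) % 2 == 0
               for j in range(m)):
            return x_hat, True
    return x_hat, False
```

On the worked example below, this sketch converges to the all-zero codeword within two iterations.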
A Worked Example of Min-Sum Decoding
Let's illustrate the algorithm with a concrete example.
Scenario Setup
Code: A rate-1/2, length-6 LDPC code defined by the parity-check matrix:
$$H = \begin{pmatrix} 1 & 1 & 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 1 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 & 1 \end{pmatrix}$$
The Tanner graph for this code is cycle-free (a tree).
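This claim can be checked programmatically: a connected graph is a tree exactly when it has one edge fewer than it has vertices. A small sketch, using the example's parity-check matrix (names illustrative):

```python
from collections import deque

# Parity-check matrix of the example code (rows = check nodes).
H = [[1, 1, 1, 0, 0, 0],
     [1, 0, 0, 1, 1, 0],
     [0, 0, 1, 0, 0, 1]]

def tanner_graph_is_tree(H):
    """The Tanner graph is a tree iff it is connected and has exactly
    (#variable nodes + #check nodes - 1) edges."""
    m, n = len(H), len(H[0])
    edges = [(j, i) for j in range(m) for i in range(n) if H[j][i]]
    if len(edges) != m + n - 1:
        return False
    # breadth-first search over the bipartite graph from check node 0
    adj = {('c', j): [] for j in range(m)}
    adj.update({('v', i): [] for i in range(n)})
    for j, i in edges:
        adj[('c', j)].append(('v', i))
        adj[('v', i)].append(('c', j))
    seen = {('c', 0)}
    queue = deque([('c', 0)])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == m + n
```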
Received LLRs: The all-zeros codeword is transmitted, and the channel LLRs computed from the received values are $L_{\text{ch}} = (-2.0, -0.8, +4.4, +3.2, +6.0, +1.6)$, so bits $x_0$ and $x_1$ are received in error. Running the two message updates of Step 2 yields the check-to-variable messages $L_{j \to i}^{(1)}$ used below.
C. LLR Update and Decision (End of Iteration 1)
We sum the channel LLR and incoming messages for each bit.
| Bit $i$ | $L_{\text{ch}}(x_i)$ | Incoming Messages Sum | $L_{\text{total}}^{(1)}(x_i)$ | $\hat{x}_i^{(1)}$ |
|---|---|---|---|---|
| 0 | $-2.0$ | $L_{0 \to 0}^{(1)} + L_{1 \to 0}^{(1)} = -0.8 + 3.2 = +2.4$ | $+0.4$ | 0 |
| 1 | $-0.8$ | $L_{0 \to 1}^{(1)} = -2.0$ | $-2.8$ | 1 |
| 2 | $+4.4$ | $L_{0 \to 2}^{(1)} + L_{2 \to 2}^{(1)} = +0.8 + 1.6 = +2.4$ | $+6.8$ | 0 |
| 3 | $+3.2$ | $L_{1 \to 3}^{(1)} = -2.0$ | $+1.2$ | 0 |
| 4 | $+6.0$ | $L_{1 \to 4}^{(1)} = -2.0$ | $+4.0$ | 0 |
| 5 | $+1.6$ | $L_{2 \to 5}^{(1)} = +4.4$ | $+6.0$ | 0 |
Analysis: After one iteration, the LLR for bit $x_0$ has flipped sign from negative to positive, correcting the first error. The temporary codeword is $\hat{x}^{(1)} = (0, 1, 0, 0, 0, 0)$. This is not a valid codeword, so we proceed.
Iteration 2
A. Variable Node → Check Node Update
We use the messages $L_{j \to i}^{(1)}$ from the previous step.
For example, variable node $v_0$ now sends $-2.0 + 3.2 = +1.2$ to check node $c_0$, and $v_2$ sends $+4.4 + 1.6 = +6.0$; check node $c_0$ therefore sends $\operatorname{sgn}(+1.2)\operatorname{sgn}(+6.0) \times \min(1.2, 6.0) = +1.2$ to bit $x_1$, whose total LLR becomes $-0.8 + 1.2 = +0.4 > 0$.
Conclusion: After the second iteration, the LLR for bit $x_1$ has also flipped to positive. The hard decision vector is now $\hat{x}^{(2)} = (0, 0, 0, 0, 0, 0)$. Checking against the parity-check matrix gives $H (\hat{x}^{(2)})^T = 0$: the codeword is valid, and the algorithm terminates successfully with the correct decoded codeword.
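The parity-check test used at the end of each iteration can be reproduced with a short snippet (matrix and vectors taken from the example above):

```python
def syndrome(H, x_hat):
    """Compute H * x_hat^T over GF(2); an all-zero syndrome means
    every parity check is satisfied."""
    return [sum(h * x for h, x in zip(row, x_hat)) % 2 for row in H]

# Parity-check matrix of the example code.
H = [[1, 1, 1, 0, 0, 0],
     [1, 0, 0, 1, 1, 0],
     [0, 0, 1, 0, 0, 1]]
```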