ML Decoding of First-Order Reed-Muller Codes
Decoding RM(1,m) Codes on the AWGN Channel
While majority logic decoding is highly effective for Reed-Muller codes on binary channels, an even more efficient algorithm exists for decoding first-order Reed-Muller codes, RM(1,m), on the Additive White Gaussian Noise (AWGN) channel. This method, based on the Fast Hadamard Transform (FHT), performs Maximum Likelihood (ML) decoding with remarkable speed.
1. Preliminaries: ML Decoding for the AWGN Channel
First, let's establish the model. An RM(1,m) code is a linear code of length $n = 2^m$ and dimension $k = m + 1$. Its basis functions are the constant monomial $1$ and the linear monomials $x_1, x_2, \dots, x_m$; every codeword is the evaluation vector of a polynomial $f(x_1, \dots, x_m) = a_0 + b_1 x_1 + \cdots + b_m x_m$ over all $2^m$ binary points.
To transmit a binary codeword $c = (c_0, c_1, \dots, c_{n-1})$ over an AWGN channel, we must first map its bits to real values. We use the standard bipolar mapping: $$0 \mapsto +1, \qquad 1 \mapsto -1, \qquad \text{i.e.,}\quad s_j = (-1)^{c_j}.$$
This converts the binary codeword $c$ into a bipolar vector $s = (s_0, s_1, \dots, s_{n-1})$. The channel then adds a noise vector $z$, whose components are independent, identically distributed zero-mean Gaussian random variables with variance $\sigma^2$. The received vector is $r = s + z$. The goal of the decoder is to find the most likely transmitted codeword given the received vector $r$. Since every bipolar codeword has the same energy $\|s\|^2 = n$, minimizing the Euclidean distance $\|r - s\|^2 = \|r\|^2 - 2\langle s, r \rangle + n$ is the same as maximizing the correlation term, so for the AWGN channel the Maximum Likelihood (ML) decoding rule simplifies to finding the codeword that maximizes the inner product (or correlation) with the received vector: $$\hat{s} = \arg\max_{s \in \mathcal{S}} \langle s, r \rangle,$$ where $\mathcal{S}$ is the set of all $2^{m+1}$ bipolar codewords of the code. A brute-force search over all of these candidates is computationally expensive.
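To make the correlation rule concrete, here is a minimal brute-force sketch (illustrative only; the helper names `bipolar_codeword` and `ml_decode_bruteforce` are not from the text). It enumerates all $2^{m+1}$ bipolar codewords of RM(1,m) and returns the one with the largest correlation, which is exactly the search the FHT will later avoid.

```python
import itertools
import numpy as np

def bipolar_codeword(a0, b):
    """Bipolar RM(1,m) codeword for f = a0 + b[0]*x1 + ... + b[m-1]*xm.

    Position j encodes the point (x1, ..., xm), with x1 as the most
    significant bit of j; each code bit c is mapped to (-1)**c.
    """
    m = len(b)
    s = np.empty(2 ** m)
    for j in range(2 ** m):
        x = [(j >> (m - 1 - k)) & 1 for k in range(m)]       # (x1, ..., xm)
        c = (a0 + sum(bk * xk for bk, xk in zip(b, x))) % 2   # binary code bit
        s[j] = (-1) ** c                                      # bipolar mapping
    return s

def ml_decode_bruteforce(r, m):
    """Maximize <s, r> over all 2**(m+1) bipolar codewords (O(n^2) work)."""
    best = None
    for a0 in (0, 1):
        for b in itertools.product((0, 1), repeat=m):
            corr = float(np.dot(bipolar_codeword(a0, b), r))
            if best is None or corr > best[0]:
                best = (corr, a0, b)
    return best  # (correlation, a0, (b1, ..., bm))
```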
2. The Hadamard Matrix
The key to efficient decoding lies in the structure of the Sylvester-type Hadamard matrices, which are defined recursively for orders $n = 2^m$:
- Base Case ($n = 2$): $H_2 = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$
- Recursive Step: $H_{2^m} = \begin{pmatrix} H_{2^{m-1}} & H_{2^{m-1}} \\ H_{2^{m-1}} & -H_{2^{m-1}} \end{pmatrix} = H_2 \otimes H_{2^{m-1}}$
2.1 Step-by-Step Construction
The recursive definition provides a clear method for building larger Hadamard matrices.
Constructing $H_2$ (Base Case): The construction starts with the base matrix:
$$H_2 = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$$
Constructing $H_4$: We construct $H_4$ by arranging four copies of $H_2$ in a block structure, negating the bottom-right block:
$$H_4 = \begin{pmatrix} H_2 & H_2 \\ H_2 & -H_2 \end{pmatrix}$$
This yields the final matrix:
$$H_4 = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix}$$
Constructing $H_8$: Similarly, $H_8$ is constructed from four blocks of the $H_4$ matrix:
$$H_8 = \begin{pmatrix} H_4 & H_4 \\ H_4 & -H_4 \end{pmatrix}$$
Substituting the $4 \times 4$ matrix for $H_4$ results in the final matrix:
$$H_8 = \begin{pmatrix} 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 & 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 & 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 & 1 & -1 & -1 & 1 \\ 1 & 1 & 1 & 1 & -1 & -1 & -1 & -1 \\ 1 & -1 & 1 & -1 & -1 & 1 & -1 & 1 \\ 1 & 1 & -1 & -1 & -1 & -1 & 1 & 1 \\ 1 & -1 & -1 & 1 & -1 & 1 & 1 & -1 \end{pmatrix}$$
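The block construction is easy to automate. The following sketch (an illustration using NumPy, not part of the original text) builds $H_{2^m}$ by repeating the recursive step and checks the resulting $H_8$ against the closed-form entry rule $(H_{2^m})_{i,j} = (-1)^{\langle i, j \rangle}$, where $\langle i, j \rangle$ is the bitwise inner product of the binary representations of $i$ and $j$.

```python
import numpy as np

def hadamard(m):
    """Sylvester-type Hadamard matrix H_{2^m} via the recursive block rule."""
    H = np.array([[1]])
    for _ in range(m):
        # H_{2n} = [[H_n, H_n], [H_n, -H_n]]
        H = np.block([[H, H], [H, -H]])
    return H

H8 = hadamard(3)
print(H8)

# Entry (i, j) equals (-1) raised to the bitwise inner product of i and j.
check = np.array([[(-1) ** bin(i & j).count("1") for j in range(8)]
                  for i in range(8)])
assert np.array_equal(H8, check)
```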
2.2 The Crucial Connection: Codewords as Hadamard Rows
The set of all $2^{m+1} = 16$ bipolar codewords of the RM(1,3) code is precisely the set of the 8 rows of $H_8$ and their negatives. We denote the rows of $H_8$ as $h_0, h_1, \dots, h_7$.
A codeword is generated by a polynomial $f(x_1, x_2, x_3) = a_0 + b_1 x_1 + b_2 x_2 + b_3 x_3$, where codeword position $j = 0, 1, \dots, 7$ carries the evaluation at the point whose binary representation is $(x_1 x_2 x_3)$. The linear part, $b_1 x_1 + b_2 x_2 + b_3 x_3$, corresponds to a specific row of $H_8$, while the constant term, $a_0$, determines its sign. The bipolar codeword for the linear part is $h_i$, where $i$ is the integer whose binary representation is $(b_1 b_2 b_3)$; the full bipolar codeword is therefore $(-1)^{a_0} h_i$.
The following table explicitly shows all 16 codewords of RM(1,3) and their relationship to the rows of $H_8$.
| Message Polynomial | Index $i$ | Binary Rep. $(b_1 b_2 b_3)$ | Binary Codeword | Corresponding Hadamard Row |
|---|---|---|---|---|
| $0$ | 0 | 000 | 00000000 | $+h_0$ |
| $1$ | 0 | 000 | 11111111 | $-h_0$ |
| $x_3$ | 1 | 001 | 01010101 | $+h_1$ |
| $1 + x_3$ | 1 | 001 | 10101010 | $-h_1$ |
| $x_2$ | 2 | 010 | 00110011 | $+h_2$ |
| $1 + x_2$ | 2 | 010 | 11001100 | $-h_2$ |
| $x_2 + x_3$ | 3 | 011 | 01100110 | $+h_3$ |
| $1 + x_2 + x_3$ | 3 | 011 | 10011001 | $-h_3$ |
| $x_1$ | 4 | 100 | 00001111 | $+h_4$ |
| $1 + x_1$ | 4 | 100 | 11110000 | $-h_4$ |
| $x_1 + x_3$ | 5 | 101 | 01011010 | $+h_5$ |
| $1 + x_1 + x_3$ | 5 | 101 | 10100101 | $-h_5$ |
| $x_1 + x_2$ | 6 | 110 | 00111100 | $+h_6$ |
| $1 + x_1 + x_2$ | 6 | 110 | 11000011 | $-h_6$ |
| $x_1 + x_2 + x_3$ | 7 | 111 | 01101001 | $+h_7$ |
| $1 + x_1 + x_2 + x_3$ | 7 | 111 | 10010110 | $-h_7$ |
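The table can be verified mechanically. Below is a short, self-contained check (illustrative code, not from the original text): it evaluates each polynomial $a_0 + b_1 x_1 + b_2 x_2 + b_3 x_3$, applies the bipolar mapping, and confirms the result equals $(-1)^{a_0} h_i$ with $i$ given by the bits $(b_1 b_2 b_3)$.

```python
import itertools
import numpy as np

# Build H8 by the recursive block rule.
H = np.array([[1]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])

for a0 in (0, 1):
    for b1, b2, b3 in itertools.product((0, 1), repeat=3):
        # Evaluate f = a0 + b1*x1 + b2*x2 + b3*x3 at all 8 points
        # (x1 is the most significant bit of the position index j).
        c = [(a0 + b1 * (j >> 2) + b2 * ((j >> 1) & 1) + b3 * (j & 1)) % 2
             for j in range(8)]
        s = np.array([(-1) ** cj for cj in c])   # bipolar codeword
        i = 4 * b1 + 2 * b2 + b3                 # row index, binary rep (b1 b2 b3)
        assert np.array_equal(s, (-1) ** a0 * H[i])
print("All 16 bipolar codewords are +/- rows of H8, as in the table.")
```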
3. ML Decoding via the Hadamard Transform
The Hadamard Transform of a vector $r \in \mathbb{R}^n$ is the matrix-vector product $\hat{R} = H_n r$. The $i$-th component of $\hat{R}$ is the inner product $\hat{R}_i = \langle h_i, r \rangle$. This single operation computes the correlations with all candidate codewords simultaneously, since the correlation with $-h_i$ is simply $-\hat{R}_i$.
The Algorithm (a code sketch follows the steps below): Input: A received vector $r$ of length $n = 2^m$. Output: The decoded message polynomial $\hat{f}(x) = a_0 + b_1 x_1 + \cdots + b_m x_m$.
- Transform: Compute the Hadamard Transform of the received vector: $\hat{R} = H_n r$.
- Search: Find the index $\hat{i}$ corresponding to the component of $\hat{R}$ with the largest absolute value over all possible indices: $\hat{i} = \arg\max_{0 \le i \le n-1} |\hat{R}_i|$.
- Decode: Let the binary representation of $\hat{i}$ be $(b_1 b_2 \dots b_m)$. a. The linear part of the polynomial is $b_1 x_1 + b_2 x_2 + \cdots + b_m x_m$. b. The constant term is determined by the sign of $\hat{R}_{\hat{i}}$. If $\hat{R}_{\hat{i}} > 0$, then $a_0 = 0$. If $\hat{R}_{\hat{i}} < 0$, then $a_0 = 1$.
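Putting the three steps together, a direct (non-fast) decoder might look like the sketch below; the function name `decode_rm1` is illustrative, and the transform is computed as an explicit matrix product, to be replaced by the FHT of the next section.

```python
import numpy as np

def decode_rm1(r):
    """ML-decode a received RM(1,m) vector r of length n = 2^m.

    Returns (a0, (b1, ..., bm)) describing a0 + b1*x1 + ... + bm*xm.
    """
    n = len(r)
    m = n.bit_length() - 1
    # 1. Transform: R = H_n r (explicit matrix product here).
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])
    R = H @ np.asarray(r, dtype=float)
    # 2. Search: index with the largest absolute correlation.
    i_hat = int(np.argmax(np.abs(R)))
    # 3. Decode: bits of i_hat give (b1, ..., bm); the sign gives a0.
    b = tuple((i_hat >> (m - 1 - k)) & 1 for k in range(m))
    a0 = 0 if R[i_hat] > 0 else 1
    return a0, b
```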
4. The Fast Hadamard Transform (FHT)
A direct matrix-vector multiplication takes $O(n^2)$ operations. The FHT is an efficient algorithm that reduces this complexity to $O(n \log n)$ by exploiting the recursive structure of the Hadamard matrix: since $H_{2^m} = H_2 \otimes H_{2^{m-1}}$, a length-$n$ transform splits into two length-$n/2$ transforms followed by $n$ additions and subtractions, giving $\log_2 n$ butterfly stages of $n$ operations each.
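A minimal FHT sketch (illustrative; the in-place butterfly follows directly from the recursive block structure described above):

```python
import numpy as np

def fht(r):
    """Fast Hadamard Transform: returns H_n @ r using n*log2(n) add/subtract ops."""
    a = np.asarray(r, dtype=float).copy()
    n = len(a)
    h = 1
    while h < n:
        # One butterfly stage: combine entries that are h apart.
        for start in range(0, n, 2 * h):
            for j in range(start, start + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a
```

Replacing the explicit matrix product in the decoder sketched above with `fht` gives the full $O(n \log n)$ decoding algorithm.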
5. Detailed Decoding Example 1: RM(1,3)
- Parameters: $m = 3$, $n = 2^3 = 8$, $k = m + 1 = 4$.
Step 1: Transmitted Codeword
- Message Polynomial: take, for example, $f(x) = 1 + x_1 + x_3$, so $a_0 = 1$ and the linear part corresponds to index $i = 5$ (binary $101$).
- Binary Codeword: $c = (1, 0, 1, 0, 0, 1, 0, 1)$.
- Bipolar Codeword: $s = (-1, +1, -1, +1, +1, -1, +1, -1)$. This corresponds to $-h_5$.
Step 2: Received Vector
- Noise Vector: suppose the channel adds $z = (0.2, -0.4, 0.3, 0.1, -0.5, 0.6, 0.2, -0.1)$.
- Received Vector: $r = s + z = (-0.8, 0.6, -0.7, 1.1, 0.5, -0.4, 1.2, -1.1)$.
Step 3: Hadamard Transform
- Compute $\hat{R} = H_8 r$. The resulting vector is: $\hat{R} = (0.4, 0.0, -0.6, -1.0, 0.0, -6.4, -0.6, 1.8)$.
Step 4: Search and Decode
- Search: By inspection, the component with the largest absolute value is $\hat{R}_5 = -6.4$. The winning index is $\hat{i} = 5$.
- Decode: a. The binary representation of $5$ is $101$. With the ordering $(b_1 b_2 b_3)$, this gives $b_1 = 1$, $b_2 = 0$, $b_3 = 1$. The linear part is $x_1 + x_3$. b. The sign of $\hat{R}_5$ is negative, so the constant term is $a_0 = 1$.
- Final Result: The decoded polynomial is $\hat{f}(x) = 1 + x_1 + x_3$. The decoding is successful.
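For reference, the numbers in this example can be reproduced with the short script below (an illustrative script, not from the original text; the hard-coded vectors are simply the ones chosen above).

```python
import numpy as np

s = np.array([-1, 1, -1, 1, 1, -1, 1, -1], dtype=float)    # -h5, message 1 + x1 + x3
z = np.array([0.2, -0.4, 0.3, 0.1, -0.5, 0.6, 0.2, -0.1])  # noise
r = s + z

H = np.array([[1]])
for _ in range(3):                                          # build H8
    H = np.block([[H, H], [H, -H]])

R = H @ r
print(np.round(R, 6))          # approx. [0.4, 0, -0.6, -1.0, 0, -6.4, -0.6, 1.8]
i_hat = int(np.argmax(np.abs(R)))
print(i_hat, R[i_hat] < 0)     # 5 True -> linear part x1 + x3, constant term a0 = 1
```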
6. Detailed Decoding Example 2: RM(1,4)
- Parameters: $m = 4$, $n = 2^4 = 16$, $k = m + 1 = 5$.
Step 1: Transmitted Codeword
- Message Polynomial: take, for example, $f(x) = x_1 + x_3 + x_4$ (no constant term, so $a_0 = 0$).
- The linear part corresponds to index $i = 11$ with binary rep $1011$.
- Binary Codeword is $c = (0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1)$.
- Bipolar Codeword $s = (+1, -1, -1, +1, +1, -1, -1, +1, -1, +1, +1, -1, -1, +1, +1, -1)$. This corresponds to row $h_{11}$ of $H_{16}$.
Step 2: Received Vector
- A small random noise vector $z$ is added, resulting in the received vector $r = s + z$.
Step 3: Hadamard Transform
- We compute the transform $\hat{R} = H_{16} r$. We expect the component $\hat{R}_{11}$ to have the largest magnitude.
- The calculation of the full transform yields a vector in which $\hat{R}_{11}$ is the maximum-magnitude component. For instance, $\hat{R}_{11} = \langle h_{11}, r \rangle = \langle h_{11}, h_{11} + z \rangle = 16 + \langle h_{11}, z \rangle$. Since $r$ is a noisy version of $h_{11}$, this inner product will be a large positive number. A full FHT calculation would show $\hat{R}_{11} \approx 16$, while all other components are significantly smaller.
Step 4: Search and Decode
- Search: The component with the largest absolute value is found to be $\hat{R}_{11}$. The winning index is $\hat{i} = 11$.
- Decode: a. The binary representation of $11$ is $1011$. With the ordering $(b_1 b_2 b_3 b_4)$, this gives $b_1 = 1$, $b_2 = 0$, $b_3 = 1$, $b_4 = 1$. The linear part is $x_1 + x_3 + x_4$. b. The sign of $\hat{R}_{11}$ is positive, so the constant term is $a_0 = 0$.
- Final Result: The decoded polynomial is $\hat{f}(x) = x_1 + x_3 + x_4$. The decoding is successful.
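A corresponding sketch for this example, with randomly drawn Gaussian noise (the noise level $\sigma = 0.5$ and the seed are arbitrary illustrative choices, so the exact numbers will vary from run to run):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, sigma = 4, 16, 0.5

# Bipolar codeword for f = x1 + x3 + x4 (a0 = 0, index 11 = binary 1011).
c = [(((j >> 3) & 1) + ((j >> 1) & 1) + (j & 1)) % 2 for j in range(n)]
s = np.array([(-1) ** cj for cj in c], dtype=float)

r = s + sigma * rng.standard_normal(n)       # AWGN channel

H = np.array([[1]])
for _ in range(m):                           # build H16
    H = np.block([[H, H], [H, -H]])
R = H @ r

i_hat = int(np.argmax(np.abs(R)))
print(i_hat, round(R[i_hat], 2))             # typically 11 and a value near +16
```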