Hadamard Decoding
Instructions

Theory Overview

Maximum-likelihood (ML) decoding minimizes the block-error probability for Reed-Muller codes on the additive white Gaussian noise (AWGN) channel. For first-order codes \(RM(1,m)\), ML decoding can be performed efficiently using the Fast Hadamard Transform (FHT).

1. Signal Model

Codeword bits are mapped to real values: \(0 \to +1\) and \(1 \to -1\). The received vector is \(\mathbf{y} = \mathbf{c}_{bip} + \mathbf{n}\), where \(\mathbf{c}_{bip}\) is the bipolar codeword and \(\mathbf{n}\) is additive Gaussian noise.
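For concreteness, the mapping and channel model can be sketched in a few lines of Python; the codeword bits and noise level here are illustrative, not taken from the exercise:

```python
import numpy as np

bits = np.array([0, 1, 0, 1])              # example RM(1,2) codeword bits (illustrative)
c_bip = 1 - 2 * bits                       # bipolar mapping: 0 -> +1, 1 -> -1
rng = np.random.default_rng(0)             # fixed seed for reproducibility
y = c_bip + 0.3 * rng.standard_normal(4)   # received vector; sigma = 0.3 is assumed
```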

2. The Hadamard Matrix

The decoding relies on the recursive Sylvester-Hadamard matrix \(H_N\): $$ H_2 = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}, \quad H_{2^m} = \begin{pmatrix} H_{2^{m-1}} & H_{2^{m-1}} \\ H_{2^{m-1}} & -H_{2^{m-1}} \end{pmatrix} $$ For \(RM(1,2)\), we use \(H_4\).
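The Sylvester recursion above translates directly into a short Python sketch (assuming NumPy; the function name is my own):

```python
import numpy as np

def hadamard(m):
    """Build the 2^m x 2^m Sylvester-Hadamard matrix by recursive doubling."""
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])  # H_{2N} = [[H_N, H_N], [H_N, -H_N]]
    return H

print(hadamard(2))  # H_4, as used for RM(1,2)
```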

3. The Algorithm

To decode a received vector \(\mathbf{y}\) of length \(2^m\):

  1. Transform: Compute the Hadamard transform \(\mathbf{z} = \mathbf{y} H_{2^m}\).
  2. Find Max: Identify the index \(k\) at which \(|z_k|\) is maximized.
  3. Decode:
    • The binary expansion of the index \(k\) gives the linear message bits.
    • The sign of \(z_k\) gives the constant message bit (\(z_k > 0 \to 0\), \(z_k \le 0 \to 1\)).
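Putting the three steps together, a minimal Python sketch of the decoder (assuming NumPy; the helper names and test vector are my own, not the grader's implementation):

```python
import numpy as np

def hadamard(m):
    """Sylvester construction of H_{2^m}."""
    H = np.array([[1]])
    for _ in range(m):
        H = np.block([[H, H], [H, -H]])
    return H

def decode_rm1(y, m):
    """ML-decode a length-2^m received vector for RM(1, m); returns codeword bits."""
    z = y @ hadamard(m)                 # step 1: Hadamard transform (O(N^2) here)
    k = int(np.argmax(np.abs(z)))       # step 2: strongest correlation
    sign = 1 if z[k] > 0 else -1        # step 3: sign gives the constant bit
    c_bip = sign * hadamard(m)[:, k]    # ML bipolar codeword = sign(z_k) * column k
    return ((1 - c_bip) // 2).astype(int)  # map back: +1 -> 0, -1 -> 1

y = np.array([0.9, -1.1, 0.8, -1.2])    # illustrative received vector
print(decode_rm1(y, 2))                 # -> [0 1 0 1]
```

For this \(\mathbf{y}\), the transform is \(\mathbf{z} = (-0.6,\ 4.0,\ 0.2,\ 0.0)\); the maximum \(|z_k|\) is at \(k = 1\) with \(z_1 > 0\), giving the codeword \(0101\).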

Procedure

  • Goal: To decode a received vector sent using the \(RM(1, 2)\) code.
  • Step 1: Observe the received real vector \(\mathbf{y}\).
  • Step 2: Compute the Fast Hadamard Transform (FHT) of \(\mathbf{y}\) to get \(\mathbf{z}\). Fill in these 4 real values in the "Part 1" boxes. Round to one decimal place.
  • Step 3: Submit Part 1. If correct, Part 2 will appear.
  • Step 4: Find the index \(k \in \{0, 1, 2, 3\}\) for which the absolute value \(|z_k|\) is the largest.
  • Step 5: Reconstruct the ML codeword \(\mathbf{\hat{c}}\) from the winning index \(k\) and the sign of \(z_k\). Enter this 4-bit binary codeword into the "Part 2" boxes.
  • Step 6: Click "Check Part 2" to get final feedback.
  • Note: You can click "New Question" to try another received vector.


An \(RM(1, 2)\) codeword was sent...

Part 1: Compute the Hadamard Transform

Compute \(\mathbf{z} = \text{FHT}(\mathbf{y})\). (Round to one decimal place).
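For length-4 vectors the transform is easy to do by hand, but the butterfly structure that makes the FHT fast is worth seeing. A sketch of an in-place \(O(N \log N)\) implementation (my own code, not the page's), which agrees with multiplying by \(H_{2^m}\):

```python
import numpy as np

def fht(y):
    """Fast Hadamard transform via butterflies; equivalent to y @ H_{2^m}."""
    z = np.asarray(y, dtype=float).copy()
    h = 1
    while h < len(z):                       # log2(N) stages
        for i in range(0, len(z), 2 * h):
            for j in range(i, i + h):       # butterfly: (a, b) -> (a+b, a-b)
                a, b = z[j], z[j + h]
                z[j], z[j + h] = a + b, a - b
        h *= 2
    return z
```

For example, `fht([0.9, -1.1, 0.8, -1.2])` gives \((-0.6,\ 4.0,\ 0.2,\ 0.0)\) after rounding to one decimal place.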

Observations

Your feedback will appear here.