LDPC Codes Incorporating Source, Noise, and Channel Memory
- This thesis discusses how memory in the source, in the noise, or in the channel can be handled efficiently within the decoding of LDPC codes. It further presents how such codes can be optimized to incorporate source memory.
At the source, memory is modeled by a Markov chain, whose transition probabilities are used at the decoder to estimate the source symbols. Although computed at the decoder, this information is treated as a-priori information. The a-priori LLRs can be incorporated directly into the Tanner graph; a novel simplified computation is shown that matches the performance of existing methods. A Turbo-like scheme is also proposed in which a BCJR decoder and an LDPC decoder decode the source and received sequences iteratively, each utilizing extrinsic information computed by the other. The Turbo-like scheme performs best at low SNRs. Subsequently, a code design algorithm based on density evolution is provided to obtain codes optimized for the decoding model with direct additional links in the Tanner graph. The optimized codes exhibit steeper performance curves than non-optimized ones.
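As an illustrative sketch only (not the thesis's actual implementation), the a-priori information supplied by a binary first-order Markov source model can be turned into LLRs as follows. The transition matrix `P` and the soft belief about the previous symbol are hypothetical placeholders chosen for the example:

```python
import numpy as np

# P[i, j] = Pr(x_t = j | x_{t-1} = i); illustrative values, not from the thesis.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def stationary(P):
    """Stationary distribution of a 2-state Markov chain (solves pi = pi P)."""
    pi0 = P[1, 0] / (P[0, 1] + P[1, 0])
    return np.array([pi0, 1.0 - pi0])

def apriori_llr(prev_soft0, P):
    """A-priori LLR of x_t given the soft probability that x_{t-1} = 0.

    Mixes the two transition rows according to the belief on the
    previous symbol, then forms log(Pr(x_t=0) / Pr(x_t=1)).
    """
    p0 = prev_soft0 * P[0, 0] + (1.0 - prev_soft0) * P[1, 0]
    return np.log(p0 / (1.0 - p0))

pi = stationary(P)
llr_first = np.log(pi[0] / pi[1])   # first symbol: only the stationary prior
llr_next = apriori_llr(0.9, P)      # later symbols: transition probabilities
```

In a graph-based decoder, such LLRs would enter the Tanner graph as additional a-priori inputs at the variable nodes; the mixing step is what makes the prior "soft" rather than conditioned on a hard decision.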
Thereafter, impulse noise with memory is investigated, modeled by the Middleton Class-A model; a Markov model provides the transition probabilities between the background and impulsive noise states. A Viterbi decoder estimates the noise sequence and an LDPC decoder estimates the transmitted symbols, and information is iterated between the two decoders to improve the overall error correction at the receiver. Possibilities for computing the noise states directly at the decoder are also investigated; the noise memory, however, cannot be incorporated directly into the Tanner graph. Lastly, a method is proposed to mitigate channel memory causing inter-symbol interference by means of an LDPC decoder: a decision-feedback-equalizer-like structure is used in which intermediate LDPC decoder results drive the equalization.
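A minimal sketch of the two-state noise model and the Viterbi state estimation described above, assuming Gaussian background and impulsive components; the transition matrix `A`, the standard deviations `sigma`, and all other parameters are illustrative assumptions, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0.98, 0.02],      # Pr(next state | current state)
              [0.30, 0.70]])     # state 0: background, state 1: impulsive
sigma = np.array([0.1, 2.0])     # background vs. impulsive std. deviation

def sample_noise(n, A, sigma, rng):
    """Draw a Markov state sequence and the corresponding Gaussian noise."""
    states = np.empty(n, dtype=int)
    states[0] = 0
    for t in range(1, n):
        states[t] = rng.random() < A[states[t - 1], 1]
    return states, rng.normal(0.0, sigma[states])

def viterbi_states(noise, A, sigma):
    """Most likely state sequence under Gaussian emissions (log domain)."""
    n = len(noise)
    logA = np.log(A)
    # Gaussian log-likelihood of each noise sample under each state.
    ll = (-0.5 * (noise[:, None] / sigma) ** 2
          - np.log(sigma) - 0.5 * np.log(2.0 * np.pi))
    delta = ll[0].copy()
    back = np.zeros((n, 2), dtype=int)
    for t in range(1, n):
        trans = delta[:, None] + logA        # [from-state, to-state] metrics
        back[t] = trans.argmax(axis=0)
        delta = trans.max(axis=0) + ll[t]
    states = np.empty(n, dtype=int)
    states[-1] = delta.argmax()
    for t in range(n - 1, 0, -1):            # trace back the best path
        states[t - 1] = back[t, states[t]]
    return states

true_states, noise = sample_noise(2000, A, sigma, rng)
est = viterbi_states(noise, A, sigma)
```

In the iterative receiver described above, the estimated noise states would feed state-dependent channel LLRs to the LDPC decoder, whose intermediate results could in turn refine the noise-sequence estimate.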