Hamming distance: the number of places in which two codewords differ; the minimum distance of a code is the smallest Hamming distance between any two of its codewords
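A minimal sketch of computing the Hamming distance between two words and the minimum distance of a code; the function names and the small repetition code are illustrative, not from the notes:

```python
from itertools import combinations

def hamming_distance(a, b):
    # Number of positions in which two equal-length words differ.
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(code):
    # d_min: the smallest pairwise Hamming distance over all codeword pairs.
    return min(hamming_distance(a, b) for a, b in combinations(code, 2))

# Example: the length-3 repetition code {000, 111} has d_min = 3.
rep_code = ["000", "111"]
```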
Drawbacks: searching for the closest codeword in the decoder is complex, and Shannon-limit performance is approached only as (k and n) tend to infinity
Linear block code
Syndrome decoding is a way to estimate the error vector and recover the code vector
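As a sketch of syndrome decoding, assuming the standard (7,4) Hamming code with a parity-check matrix whose columns are the binary numbers 1 through 7 (this particular matrix is an assumption, not given in the notes):

```python
# Parity-check matrix H: column i (1-based) is i written in binary,
# so a single-bit error produces a syndrome equal to the error position.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(r):
    # s = H * r^T over GF(2); s = 0 means r is a valid codeword.
    return [sum(h * x for h, x in zip(row, r)) % 2 for row in H]

def correct(r):
    # Read the syndrome as a binary number to locate a single-bit error.
    s = syndrome(r)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    r = list(r)
    if pos:                      # nonzero syndrome -> flip the flagged bit
        r[pos - 1] ^= 1
    return r
```

For example, flipping the last bit of the codeword 1110000 gives syndrome 111 (position 7), and `correct` restores the original word.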
Convolutional codes:
* The code cannot be separated into independent blocks
* Each code bit depends on previous data bits
* We can encode using a shift register, XOR gates, and a multiplexer
* We replace the code word with a code sequence
* The contents of the shift register define the state of the encoder
* We use a systematic encoder and its trellis representation
* Decoding uses the Viterbi algorithm
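The shift-register-and-XOR encoding above can be sketched as a rate-1/2 convolutional encoder; the generator taps (the common octal 7,5 pair) are an assumed example, not taken from the notes:

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    # Rate-1/2 convolutional encoder sketch (assumed generators 7,5 octal).
    # A 2-bit shift register holds the encoder state; each input bit is
    # XORed with tapped register contents to produce two output bits.
    state = [0, 0]
    out = []
    for b in bits:
        window = [b] + state
        out.append(sum(t * x for t, x in zip(g1, window)) % 2)
        out.append(sum(t * x for t, x in zip(g2, window)) % 2)
        state = [b, state[0]]    # shift the new bit into the register
    return out
```

A single 1 followed by zeros yields the encoder's impulse response 11 10 11, which a Viterbi decoder would match against the trellis.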
Introduction to Turbo codes:
* We could not approach the capacity limit of Shannon's theory
* Practical systems were held to a limit lower than Shannon's, called the cut-off rate bound
* In 1993 the cut-off rate bound was significantly exceeded by the Turbo codes presented at ICC, which came within 0.7 dB of the Shannon limit
Concatenated codes:
* Increasing k increases the power of FEC codes, but decoding complexity increases too.
* An outer encoder's output feeds a second encoder, so decoding can be performed in simpler stages.
* The second encoder is called the inner encoder.
* The overall concatenated code is more complex than either constituent code.
The drawback is error propagation: errors passed from one decoder to the next can overwhelm that code's ability to correct them.
We fix this and improve the system by distributing these errors among a number of separate code words.
* This is called a rectangular or block interleaver
* The interleaver is denoted π and the de-interleaver π⁻¹
* The rows of π hold outer code words
* The columns of π hold inner data blocks
* Outer code: data length k1, code length n1
* Inner code: data length k2, code length n2
* Each of the k2 rows of π contains a code word of the outer code
* Each of the n1 columns of π is encoded by the inner code, which adds its parity
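A sketch of a rectangular (block) interleaver that writes row-wise and reads column-wise, so a burst of channel errors lands in separate rows (i.e. separate code words); the function names and dimensions are illustrative:

```python
def interleave(bits, rows, cols):
    # Rectangular (block) interleaver: write row-wise, read column-wise.
    # A burst of adjacent errors on the channel is spread across rows,
    # so each code word sees only a few of them.
    assert len(bits) == rows * cols
    table = [bits[r * cols:(r + 1) * cols] for r in range(rows)]
    return [table[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    # The inverse permutation: write column-wise, read row-wise,
    # which equals interleaving with rows and columns swapped.
    return interleave(bits, cols, rows)
```

De-interleaving the interleaved sequence restores the original order, which is why the de-interleaver is written π⁻¹.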
By encoding the parity of the outer code with the inner code...