Error-Correcting Codes


Channel coding and error correction in digital transmission

    Prof. L. Vandendorpe

    UCL Communications and Remote Sensing Lab.


    Outline

    Basics of error correcting codes

    Block and convolutional codes

    ML hard and soft decoding; the Viterbi algorithm


    Basics of error correcting codes

    Error correcting codes improve system performance

    Emphasis not on code design but on receiver structure

    FEC: Forward Error Correction


    Basics of error correcting codes: assumption

    Classical codes: perform well when channel errors are independent

    Assumption: Interleaving makes the channel memoryless

A burst of errors on the channel does not appear as a burst at the decoder input


    Basics of error correcting codes: system model


    Information required

To compute the FEC performance, one needs the DMC (discrete memoryless channel) transition probabilities

$p[y_j \mid x_j, z_j]$   (1)

$y_j$: decoder input at channel use $j$

$x_j$: coder output or interleaver input at channel use $j$

$z_j$: channel state information at channel use $j$


    Elementary linear block code concepts

    Binary input and output assumed

$k$ input bits, $n$ output bits; code rate $k/n$; redundancy $n-k$ bits

Hence $2^k$ possible codewords out of the $2^n$ possible $n$-tuples

One-to-one mapping between input message ($k$ bits) and codeword

Error correction capability: not all $n$-tuples are possible

Codewords are chosen so that a number of errors must occur before one codeword can be confused with another


    Properties of linear block codes

    Modulo-2 sum of 2 codewords is another codeword

The all-zeros word is a codeword

Take codewords $x_a$, $x_b$ at distance $d_H$. Take any codeword $x_c$. Let $x_a' = x_a \oplus x_c$, $x_b' = x_b \oplus x_c$. Then $d[x_a', x_b'] = d_H$

Hence the set of distances to the other codewords is the same seen from any codeword, and the minimum distance equals the minimum weight of a nonzero codeword


    Channel description

Described by $p[y \mid x_m, z]$, where the received vector is $y = (y_1, \ldots, y_n)$

For a memoryless channel,

$p[y \mid x_m, z] = \prod_{j=1}^{n} p[y_j \mid x_{mj}, z_j]$   (2)

If no jammer state information is available,

$p[y \mid x_m] = \prod_{j=1}^{n} p[y_j \mid x_{mj}]$   (3)
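As an illustration of the factorization in (2)-(3), here is a minimal Python sketch that evaluates the log-likelihood of a candidate codeword over a memoryless channel; the transition table and the sequences are hypothetical:

```python
import math

def sequence_log_likelihood(y, x, p_trans):
    """Log-likelihood ln p[y|x] over a memoryless channel:
    the sum over channel uses of ln p[y_j | x_j]."""
    return sum(math.log(p_trans[(yj, xj)]) for yj, xj in zip(y, x))

# Hypothetical BSC with crossover probability p = 0.1
p = 0.1
p_trans = {(0, 0): 1 - p, (1, 1): 1 - p, (0, 1): p, (1, 0): p}

y = [1, 0, 1, 1]   # received word
x = [1, 0, 0, 1]   # candidate codeword (one disagreement with y)
print(sequence_log_likelihood(y, x, p_trans))   # = 3 ln(0.9) + ln(0.1)
```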


    Example

Binary symmetric channel with transition probability $p$

When the jammer is off, the noise psd is $N_0/2$; when it is on, $(N_0 + N_J)/2$

Receiver knows $N_0$ and $N_J$

For BPSK signalling, $p_j = Q(\sqrt{2E_b/N_{z_j}})$, with $N_{z_j} = N_0$ when the jammer is off and $N_0 + N_J$ when it is on

$\mathcal{J}$: set of channel uses with active jammer ($n_1$ positions)
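For numerical evaluation, one can use the standard identity $Q(x) = \frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$, so that $Q(\sqrt{2E_b/N}) = \frac{1}{2}\operatorname{erfc}(\sqrt{E_b/N})$. A minimal sketch; the two signal-to-noise ratios are hypothetical:

```python
import math

def bpsk_error_prob(eb_over_n):
    """p = Q(sqrt(2 Eb/N)) = 0.5 * erfc(sqrt(Eb/N)) for BPSK over AWGN."""
    return 0.5 * math.erfc(math.sqrt(eb_over_n))

p0 = bpsk_error_prob(10 ** (6 / 10))   # hypothetical 6 dB: jammer off
p1 = bpsk_error_prob(10 ** (0 / 10))   # hypothetical 0 dB: jammer on
print(p0, p1)                          # p1 is much larger than p0
```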


    Example

With perfect channel (jammer) state information available,

$p[y \mid x_m, z] = \prod_{j \in \mathcal{J}} p[y_j \mid x_{mj}, z_j = 1] \prod_{j \notin \mathcal{J}} p[y_j \mid x_{mj}, z_j = 0]$   (4)

$= p_1^{d_1} (1-p_1)^{n_1-d_1} \, p_0^{d_0} (1-p_0)^{n-n_1-d_0}$   (5)

$p_0$ applies when the noise psd is $N_0/2$, $p_1$ when it is $(N_0+N_J)/2$

$d_1$ (resp. $d_0$): Hamming distance between $y$ and $x_m$ over the positions in $\mathcal{J}$ (resp. outside $\mathcal{J}$)


    Example

If no channel (jammer) state information is available,

$p[y \mid x_m] = p^{d_H} (1-p)^{n-d_H}$   (6)

with $p$ the average error probability


    Optimum decoding rule - no JSI

Receiver knows:

the received vector $y$,

the possible codewords $x_m$,

the prior message probabilities,

the channel transition probabilities $p[y \mid x_m]$

    Decoding rule: minimize the average number of bit errors


    Optimum decoding rule - no JSI - BSC

For the BSC without JSI,

$p[y \mid x_m] = p^{d_H} (1-p)^{n-d_H}$   (8)

Maximize the log-likelihood instead:

$\ln p[y \mid x_m] = d_H \ln p + (n - d_H)\ln(1-p) = d_H \ln\frac{p}{1-p} + n \ln(1-p)$   (9)

As $\ln\frac{p}{1-p} < 0$ for $p < 1/2$, maximizing (9) amounts to minimizing $d_H$: minimum Hamming distance decoding
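A minimal sketch of this rule: brute-force minimum Hamming distance decoding over a small hypothetical codebook:

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(ai != bi for ai, bi in zip(a, b))

def ml_decode_bsc(y, codebook):
    """ML decoding on a BSC with p < 1/2: pick the codeword
    at minimum Hamming distance from the received word y."""
    return min(codebook, key=lambda c: hamming(y, c))

# Hypothetical (5,2) code: 2^2 = 4 codewords out of 2^5 = 32 5-tuples
codebook = [(0,0,0,0,0), (0,1,0,1,1), (1,0,1,0,1), (1,1,1,1,0)]
print(ml_decode_bsc((0,1,0,0,1), codebook))   # -> (0,1,0,1,1), dH = 1
```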


    Optimum decoding rule - no JSI - soft channel

Channel inputs $\pm 1$

Gaussian channel:

$p[y_j \mid x_j] = \frac{1}{\sqrt{\pi N_0}} \exp\left[-\frac{(y_j - x_j)^2}{N_0}\right]$   (10)

Taking the log,

$\ln p[y \mid x_m] = \sum_{j=1}^{n} \ln p[y_j \mid x_{mj}] = -\frac{n}{2} \ln(\pi N_0) - \frac{1}{N_0} \sum_{j=1}^{n} (y_j - x_{mj})^2$   (11)

hence minimize $\sum_{j=1}^{n} (y_j - x_{mj})^2$


    Optimum decoding rule - no JSI - soft channel

Hence minimum Euclidean distance decoding

Soft decoding always outperforms hard decoding

Expand, for the Gaussian channel:

$\sum_{j=1}^{n} (y_j - x_{mj})^2 = \sum_{j=1}^{n} y_j^2 - 2 \sum_{j=1}^{n} x_{mj} y_j + \sum_{j=1}^{n} x_{mj}^2$   (12)

The first term is common to all codewords and the last is constant for $\pm 1$ inputs; hence maximize the following correlation:

$\sum_{j=1}^{n} x_{mj} y_j$   (13)
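A minimal sketch of correlation decoding (13) for $\pm 1$ signalling; the codebook and the received values are hypothetical:

```python
def correlation_decode(y, codebook):
    """Soft ML decoding over AWGN with +/-1 inputs: pick the
    codeword maximizing the correlation sum_j x_mj * y_j (13)."""
    return max(codebook, key=lambda x: sum(xj * yj for xj, yj in zip(x, y)))

# Hypothetical +/-1 codewords and noisy matched-filter outputs
codebook = [(-1,-1,-1,-1), (-1,+1,-1,+1), (+1,-1,+1,-1), (+1,+1,+1,+1)]
y = (0.8, -1.2, 0.3, -0.6)
print(correlation_decode(y, codebook))   # -> (+1, -1, +1, -1)
```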


    Optimum decoding rule - JSI - BSC

For the BSC with perfect JSI,

$p[y \mid x_m, z] = p_1^{d_1} (1-p_1)^{n_1-d_1} \, p_0^{d_0} (1-p_0)^{n-n_1-d_0}$   (14)

Maximize the log-likelihood instead:

$\ln p[y \mid x_m, z] = d_1 \ln\frac{p_1}{1-p_1} + n_1 \ln(1-p_1) + d_0 \ln\frac{p_0}{1-p_0} + (n-n_1) \ln(1-p_0)$   (15)

Use the following weighted distance:

$d_1 \ln\frac{p_1}{1-p_1} + d_0 \ln\frac{p_0}{1-p_0}$   (16)

When jamming is on, $p_1 \to 0.5$ and $\ln\frac{p_1}{1-p_1} \to 0$; then minimize $d_0$
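A minimal sketch of the weighted metric (16); the probabilities and the jammer pattern are hypothetical:

```python
import math

def jsi_metric(y, x, jammed, p0, p1):
    """Weighted distance (16): d1*ln(p1/(1-p1)) + d0*ln(p0/(1-p0)).
    Both logs are negative for p < 1/2, so the metric is maximized;
    errors on clean positions (small p0) are penalized the most."""
    d0 = sum(yj != xj for yj, xj, z in zip(y, x, jammed) if not z)
    d1 = sum(yj != xj for yj, xj, z in zip(y, x, jammed) if z)
    return d1 * math.log(p1 / (1 - p1)) + d0 * math.log(p0 / (1 - p0))

y, x = (1, 0, 1, 1), (1, 1, 1, 0)
jammed = (False, True, False, True)     # hypothetical jammer state z
print(jsi_metric(y, x, jammed, p0=0.01, p1=0.4))
```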


    Example of coded/uncoded FH/MFSK


    Convolutional coding

Continuous mode instead of blocks

Highly structured

Often used with soft decision decoding

Example of a rate 1/2 convolutional code (2 output bits per input bit)

Defined by the number of stages and the tap connections of the shift register

State: content of the shift register
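A minimal sketch of such an encoder, assuming the common constraint-length-3 code with octal generators (7, 5); the taps of the example in the slides may differ:

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder: the shift register holds the
    current bit plus k-1 past bits; each input bit yields 2 output bits."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & ((1 << k) - 1)   # shift in the new bit
        out.append(bin(state & g1).count("1") % 2)    # parity of the g1 taps
        out.append(bin(state & g2).count("1") % 2)    # parity of the g2 taps
    return out

print(conv_encode([1, 0, 1, 1, 0, 0]))   # trailing 0s drive the state to 0
# -> [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1]
```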


Example of a convolutional code


    Convolutional coding

    In the example: 4 states

    System with memory; result depends on input and state

Constraint length = 1 + number of past inputs affecting the output = 3 here

    State diagram representation (finite state machine)

    Trellis diagram representation


    State representation


    Trellis diagram


    4 possible states at each interval

here the trellis is time-invariant; it could be time-variant

    Decoder knows:

    code structure

    received sequence

    channel transition probabilities

    Target: decode with minimum number of errors


    Decoding

One-to-one correspondence between information sequence and encoder output

For that correspondence there is a single path in the trellis

Decoding = estimation of the path in the trellis

    For equiprobable symbols, ML decoding

    All paths are equiprobable


    Decoding

About the log-likelihood function:

$\ln p[y \mid x_m] = \sum_{j} \ln p[y_j \mid x_{mj}]$   (21)

$\ln p[y_j \mid x_{mj}]$ is called the branch metric


    Example



    Soft decoding

DMC with more outputs than inputs

In the limit: continuous output; example: output of the matched filter (MF) with AWGN

    if hard decision: partition in 2 regions

    if soft decision: more regions


    Soft decoding

Transition probabilities computed as the area below the conditional pdf over the appropriate output region

Example given for $p[3 \mid 0]$

The decoding rule stays the same: maximize $p[y \mid x_m]$ or $\ln p[y \mid x_m]$

    Viterbi algorithm can be used both for hard and soft decoding


    Viterbi algorithm


An elegant way to perform ML decoding (Viterbi, 1967)

Example given for hard decoding

See the truncated trellis

Depth: position considered, counted from the leftmost position

    Truncated here: transmission of 0s at the end

    received sequence is shown


    Trellis


    Viterbi algorithm


Compute $d_H$ from depth 0 to 1

    The metrics or distances are indicated above the candidate states

    The metrics are accumulated (additive concept)


    Viterbi algorithm


    See depth 3; 8 paths enter depth 3; 2 per state

For each state, keep only the arriving path with the lowest accumulated metric

If that state lies on the best overall path, the arriving path with the lowest metric is necessarily the best

The selected path is called the surviving path

Here the lowest metric, because we use $d_H$; otherwise the largest

Discarding paths is the key to the efficiency of the VA; a minimal sketch follows
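A minimal hard-decision sketch for the same assumed (7, 5) rate-1/2 code as in the encoder above, keeping one survivor per state with the lowest accumulated Hamming metric:

```python
def viterbi_decode(received, g1=0b111, g2=0b101, k=3):
    """Hard-decision Viterbi decoding of a rate-1/2 convolutional code.
    received: flat list of hard bits, two per trellis step."""
    n_states = 1 << (k - 1)
    INF = float("inf")
    metric = [0] + [INF] * (n_states - 1)    # start in the all-zero state
    paths = [[]] + [None] * (n_states - 1)   # surviving input sequences
    for t in range(0, len(received), 2):
        r0, r1 = received[t], received[t + 1]
        new_metric = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):            # s = previous k-1 input bits
            if paths[s] is None:
                continue
            for b in (0, 1):                          # hypothesize input bit b
                full = (s << 1) | b                   # k-bit register contents
                c0 = bin(full & g1).count("1") % 2    # expected output bits
                c1 = bin(full & g2).count("1") % 2
                ns = full & (n_states - 1)            # next state
                m = metric[s] + (c0 != r0) + (c1 != r1)
                if m < new_metric[ns]:                # keep only the survivor
                    new_metric[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[0]   # trellis terminated with 0s: path ending in state 0

coded = [1,1, 1,0, 0,0, 0,1, 0,1, 1,1]   # encoding of [1,0,1,1,0,0]
coded[2] ^= 1                            # inject one channel error
print(viterbi_decode(coded))             # -> [1, 0, 1, 1, 0, 0]
```

A real decoder would store survivor pointers and trace back over a fixed depth instead of copying whole input sequences, as discussed in the remarks below.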


    Viterbi algorithm


The branch sequence (0,0)(0,0)(0,0) gives $d_H = 3$ at depth 3

The branch sequence (1,1)(1,0)(1,1) gives $d_H = 2$ at depth 3: keep this one


Sometimes different paths may have equal metrics

Keep both, or discard one arbitrarily

Here: resolved at the next step


    Soft decision decoding


    Identical procedure

Now keep the largest metric; we use $\ln p[y \mid x_m]$
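Only the branch metric changes with respect to the hard case; a sketch with a hypothetical 2-input/4-output DMC table:

```python
import math

# Hypothetical DMC: p[y | x] for output levels y in {0, 1, 2, 3}
P = {0: [0.60, 0.25, 0.10, 0.05],    # input bit 0
     1: [0.05, 0.10, 0.25, 0.60]}    # input bit 1

def branch_metric(y_pair, x_pair):
    """Soft branch metric ln p[y|x] for one trellis branch (2 channel
    uses); the Viterbi algorithm now keeps the largest accumulated sum."""
    return sum(math.log(P[x][y]) for y, x in zip(y_pair, x_pair))

print(branch_metric((3, 0), (1, 0)))   # = ln 0.60 + ln 0.60
```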


    Soft decision decoding: example


Assume the following DMC with the transition probabilities reported


    Soft decision decoding: branch and cumulative metrics


    Viterbi algorithm: remarks


Here the paths are forced back to state (0,0) by sending 0s at the end

Sometimes decisions are required before zeroing

Assume no zeroing and $S$ states; at each depth there are $S$ surviving paths

Trace the paths backwards

High probability that they merge at some depth

A reliable decision can be taken for the bits before the merge point

The depth to trace back is a random variable

Usually a fixed memory (truncation depth) is used; decisions are forced after that depth


    Concatenated Reed-Solomon/convolutional coding


R-S (non-binary) code: outer code

Convolutional code: inner code

Superchannel: a DMC

Errors at the Viterbi decoder output occur in bursts

Therefore interleaving/deinterleaving before R-S decoding

    makes the superchannel memoryless


    Interleaving


    makes the channel memoryless

Errors occur in bursts

Codes designed for non-bursty (independent) errors

    block or convolutional interleaving
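A minimal sketch of a block interleaver with hypothetical dimensions: symbols are written row by row and read column by column, so a channel burst is spread apart after deinterleaving:

```python
def block_interleave(bits, rows, cols):
    """Write row by row, read column by column."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(bits, rows, cols):
    """Inverse permutation: write column by column, read row by row."""
    assert len(bits) == rows * cols
    return [bits[c * rows + r] for r in range(rows) for c in range(cols)]

data = list(range(12))               # 12 symbols in a 3x4 matrix
tx = block_interleave(data, 3, 4)    # [0,4,8,1,5,9,2,6,10,3,7,11]
# A burst hitting 3 consecutive transmitted symbols, tx[3:6] = (1, 5, 9),
# lands on positions 4 apart after deinterleaving.
assert block_deinterleave(tx, 3, 4) == data
```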
