Introduction to Information Theory:
channel capacity and models

A.J. Han Vinck
University of Essen
May 2011
This lecture

- Some models
- Channel capacity
  - Shannon channel coding theorem
  - converse
some channel models

  Input X  --->  [ P(y|x) ]  --->  Output Y
             (transition probabilities)

memoryless:
- the output at time i depends only on the input at time i
- the input and output alphabets are finite
Example: binary symmetric channel (BSC)

  An error source produces the binary error sequence E;
  the channel output is Y = X ⊕ E.

  Transition diagram: 0 → 0 and 1 → 1 with probability 1-p;
  0 → 1 and 1 → 0 with probability p.

E is the binary error sequence s.t. P(1) = 1 - P(0) = p
X is the binary information sequence
Y is the binary output sequence
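Not part of the original slides: a minimal Python sketch of this model, with an i.i.d. error sequence E ~ Bernoulli(p). The block length and the value p = 0.1 are illustrative; numpy is assumed available.

```python
import numpy as np

def bsc(x, p, rng=np.random.default_rng(0)):
    """Pass the binary sequence x through a BSC: Y = X xor E, with P(E=1) = p."""
    e = (rng.random(len(x)) < p).astype(np.uint8)   # error sequence E
    return x ^ e

x = np.random.default_rng(1).integers(0, 2, size=100_000, dtype=np.uint8)
y = bsc(x, p=0.1)
print("empirical crossover rate:", np.mean(x != y))  # close to 0.1
```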
from AWGN to BSC

  (Figure: a binary input of amplitude ±A is sent over an AWGN
  channel with noise variance σ² and hard-quantized at the
  receiver; the result is a BSC with crossover probability p.)

Homework: calculate the capacity as a function of A and σ².
Other models

Z-channel (optical):
  0 → 0 (light on) with probability 1;
  1 → 0 with probability p, 1 → 1 (light off) with probability 1-p;
  P(X=0) = P0

Erasure channel (MAC):
  0 → 0 and 1 → 1 with probability 1-e;
  0 → erasure and 1 → erasure with probability e;
  P(X=0) = P0
Erasure with errors

  0 → 0 and 1 → 1 with probability 1-p-e;
  0 → 1 and 1 → 0 with probability p;
  0 → erasure and 1 → erasure with probability e.
burst error model (Gilbert-Elliott)

Random error channel; outputs independent:
  error source with P(0) = 1 - P(1)

Burst error channel; outputs dependent:
  error source with state info: good or bad
  P(0 | state = bad) = P(1 | state = bad) = 1/2
  P(0 | state = good) = 1 - P(1 | state = good) = 0.999

State transitions:
  good → good with probability Pgg, good → bad with Pgb,
  bad → good with Pbg, bad → bad with Pbb.
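As an illustration (my addition, not in the slides), a minimal simulation of such a two-state Markov error source; all parameter values except the 0.999 good-state figure are made up.

```python
import numpy as np

def gilbert_elliott(n, p_gb=0.01, p_bg=0.1,
                    p_err_good=0.001, p_err_bad=0.5,
                    rng=np.random.default_rng(0)):
    """Generate n error bits from a good/bad two-state Markov error source."""
    errors = np.empty(n, dtype=np.uint8)
    bad = False                                   # start in the good state
    for i in range(n):
        errors[i] = rng.random() < (p_err_bad if bad else p_err_good)
        if bad:
            bad = not (rng.random() < p_bg)       # bad -> good w.p. Pbg
        else:
            bad = rng.random() < p_gb             # good -> bad w.p. Pgb
    return errors

e = gilbert_elliott(100_000)
print("overall error rate:", e.mean())            # errors arrive in bursts
```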
channel capacity:

  I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)   (Shannon 1948)

        X → channel → Y
  H(X) enters the channel; H(X|Y) is the uncertainty about X
  that remains after observing Y.

  capacity = max over P(x) of I(X;Y)

notes:
  I(X;Y) depends on the input probabilities; since the transition
  probabilities are fixed by the channel, the capacity is found by
  maximizing over P(x).
Practical communication system design

  message (one of 2^k) → encoder: code word of length n, taken
  from a code book → channel (with errors) → decoder: uses the
  same code book to produce an estimate of the message

There are 2^k code words of length n.
k is the number of information bits transmitted in n channel uses.
Channel capacity

Definition:
The rate R of a code is the ratio k/n, where
k is the number of information bits transmitted in n channel uses.

Shannon showed that:
  for R ≤ C encoding methods exist
  with decoding error probability → 0 (as n → ∞).
Encoding and decoding according to Shannon

Code: 2^k binary codewords, drawn at random with P(0) = P(1) = ½
Channel errors: P(0 → 1) = P(1 → 0) = p,
  i.e. the number of typical error sequences ≈ 2^{nh(p)}
Decoder: search around the received sequence for a codeword
  with ≈ np differences

  (Picture: the 2^k codewords as points in the space of 2^n
  binary sequences; decoding regions are spheres of radius ≈ np.)
decoding error probability

1. For t errors: P( |t/n - p| > ε ) → 0 for n → ∞
   (law of large numbers)

2. More than one code word in the decoding region
   (codewords random):

   P(> 1) ≈ (2^k - 1) · 2^{nh(p)} / 2^n
          → 2^{-n(1 - h(p) - R)} = 2^{-n(C_BSC - R)} → 0

   for R = k/n < 1 - h(p) and n → ∞
channel capacity: the BSC

  (BSC with crossover probability p, as above)

  I(X;Y) = H(Y) - H(Y|X)

  The maximum of H(Y) is 1, since Y is binary.

  H(Y|X) = P(X=0)h(p) + P(X=1)h(p) = h(p)

Conclusion: the capacity for the BSC is C_BSC = 1 - h(p).
Homework: draw C_BSC; what happens for p > ½?
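A sketch for the homework (my addition; numpy and matplotlib assumed available): compute and plot C_BSC = 1 - h(p).

```python
import numpy as np
import matplotlib.pyplot as plt

def h(p):
    """Binary entropy; h(0) = h(1) = 0 by convention."""
    p = np.asarray(p, dtype=float)
    out = np.zeros_like(p)
    m = (p > 0) & (p < 1)
    out[m] = -p[m] * np.log2(p[m]) - (1 - p[m]) * np.log2(1 - p[m])
    return out

p = np.linspace(0, 1, 501)
plt.plot(p, 1 - h(p))
plt.xlabel("bit error p"); plt.ylabel("C_BSC = 1 - h(p)")
plt.show()   # zero at p = 1/2; for p > 1/2, flipping the output recovers capacity
```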
channel capacity: the BSC

  (Plot: channel capacity C_BSC versus bit error probability p on
  [0, 1]; the curve starts at 1.0, drops to 0 at p = 0.5, and rises
  back to 1.0 at p = 1.)

  Explain the behaviour!
channel capacity: the Z-channel

Application in optical communications

  0 → 0 (light on); 1 → 0 with probability p,
  1 → 1 (light off) with probability 1-p; P(X=0) = P0

  H(Y) = h( P0 + p(1 - P0) )
  H(Y|X) = (1 - P0) h(p)

For capacity, maximize I(X;Y) = H(Y) - H(Y|X) over P0.
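A grid search is enough to see the optimum (my addition; the value p = 0.1 is illustrative):

```python
import numpy as np

def h(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def z_capacity(p):
    """Maximize I(X;Y) = h(P0 + p(1-P0)) - (1-P0) h(p) over P0."""
    P0 = np.linspace(1e-6, 1 - 1e-6, 100_001)
    I = h(P0 + p * (1 - P0)) - (1 - P0) * h(p)
    k = np.argmax(I)
    return I[k], P0[k]

C, P0_star = z_capacity(p=0.1)
print(f"C_Z ≈ {C:.4f} bits/use at P(X=0) ≈ {P0_star:.3f}")
```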
channel capacity: the erasure channel

Application: CDMA detection

  0 → 0 and 1 → 1 with probability 1-e;
  0 → erasure and 1 → erasure with probability e; P(X=0) = P0

  I(X;Y) = H(X) - H(X|Y)
  H(X) = h(P0)
  H(X|Y) = e h(P0)

So I(X;Y) = (1 - e) h(P0), maximized by P0 = ½.
Thus C_erasure = 1 - e.
(check!, draw and compare with the BSC and the Z-channel)
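A quick numeric check of this (my addition; e = 0.2 is illustrative):

```python
import numpy as np

def h(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

e = 0.2
P0 = np.linspace(0.001, 0.999, 999)
I = (1 - e) * h(P0)                   # I(X;Y) for the erasure channel
print(P0[np.argmax(I)], I.max())      # -> 0.5 and 0.8 = 1 - e
```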
Erasure with errors: calculate the capacity!

  0 → 0 and 1 → 1 with probability 1-p-e;
  0 → 1 and 1 → 0 with probability p;
  0 → erasure and 1 → erasure with probability e.
Example

  Ternary channel: 0 → 0 and 2 → 2 with probability 1;
  input 1 goes to each of the outputs 0, 1, 2 with probability 1/3.

Consider the following example:

For P(0) = P(2) = p, P(1) = 1-2p:

  H(Y) = h(1/3 - 2p/3) + (2/3 + 2p/3)
  (P(Y=1) = (1-2p)/3 = 1/3 - 2p/3, and given Y ≠ 1 the outputs
  0 and 2 are equally likely, contributing one bit)

  H(Y|X) = (1-2p) log2 3

Q: maximize H(Y) - H(Y|X) as a function of p
Q: is this the capacity?

Hint, use the following: log2 x = ln x / ln 2; d ln x / dx = 1/x.
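A numeric answer to the first question (my addition), searching the one-parameter family of input distributions from the slide:

```python
import numpy as np

def h(q):
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

p = np.linspace(0.0, 0.5, 50_001)      # P(0) = P(2) = p, P(1) = 1 - 2p
I = h(1/3 - 2*p/3) + (2/3 + 2*p/3) - (1 - 2*p) * np.log2(3)
k = np.argmax(I)
print(f"max I ≈ {I[k]:.4f} bits at p = {p[k]:.3f}")
# ≈ 1.027 bits at p ≈ 0.473: slightly more than 1 bit, so the noisy
# input 1 is still used occasionally
```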
channel models: general diagram

  Input alphabet X = {x1, x2, ..., xn}
  Output alphabet Y = {y1, y2, ..., ym}
  Transition probabilities Pj|i = PY|X(yj|xi)

  (Diagram: every input xi is connected to every output yj with
  probability Pj|i.)

In general: calculating the capacity needs more theory.

The statistical behavior of the channel is completely defined by
the channel transition probabilities Pj|i = PY|X(yj|xi).
* clue:

  I(X;Y) is concave (convex ∩) in the input probabilities,

  i.e. finding the maximum is simple: any local maximum is global.
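One standard piece of that "more theory" is the Blahut-Arimoto algorithm, which exploits exactly this concavity. A minimal sketch (my addition, not from the slides), verified on the BSC:

```python
import numpy as np

def blahut_arimoto(W, iters=500):
    """Capacity of a DMC with transition matrix W[i, j] = P(y = j | x = i)."""
    n, _ = W.shape
    r = np.full(n, 1.0 / n)                       # input distribution, start uniform
    for _ in range(iters):
        q = r[:, None] * W                        # r(x) W(y|x)
        q /= q.sum(axis=0, keepdims=True)         # posterior q(x|y)
        r = np.exp((W * np.log(np.clip(q, 1e-300, None))).sum(axis=1))
        r /= r.sum()                              # r(x) ∝ exp Σ_y W(y|x) ln q(x|y)
    p_y = r @ W                                   # output distribution
    C = np.sum(r[:, None] * W * np.log2(np.clip(W / p_y, 1e-300, None)))
    return C, r

W_bsc = np.array([[0.9, 0.1], [0.1, 0.9]])        # BSC with p = 0.1
C, r = blahut_arimoto(W_bsc)
print(C)                                          # ≈ 0.531 = 1 - h(0.1)
```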
Channel capacity: converse

For R > C the decoding error probability > 0.

  (Plot: Pe versus the rate k/n; Pe is negligible below C and
  bounded away from 0 for k/n > C.)
Converse: for a discrete memoryless channel

  I(X^n; Y^n) = H(Y^n) - Σ_{i=1}^n H(Yi | Xi)
              ≤ Σ_{i=1}^n H(Yi) - Σ_{i=1}^n H(Yi | Xi)
              = Σ_{i=1}^n I(Xi; Yi)
              ≤ nC

The source generates one out of 2^k equiprobable messages:

  source → m → encoder → X^n → channel → Y^n → decoder → m'

Let Pe = probability that m' ≠ m.
converse (R := k/n)

  k = H(M) = I(M; Y^n) + H(M | Y^n)
           ≤ I(X^n; Y^n) + 1 + k Pe    (data processing; Fano: H(M|Y^n) ≤ 1 + k Pe)
           ≤ nC + 1 + k Pe

Rearranging: Pe ≥ 1 - Cn/k - 1/k = 1 - C/R - 1/(nR).

Hence: for large n, and R > C,
the probability of error Pe > 0.
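Plugging in numbers (my addition; the values are illustrative) shows how strong the bound is:

```python
# BSC with p = 0.1 has C = 1 - h(0.1) ≈ 0.531; try rate R = 0.6 > C
C, R, n = 0.531, 0.6, 10_000
print("Pe >=", 1 - C / R - 1 / (n * R))   # ≈ 0.115, bounded away from 0
```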
We used the data processing theorem.

Cascading of Channels:

  X → channel 1 → Y → channel 2 → Z

The overall transmission rate I(X;Z) for the cascade cannot
be larger than I(Y;Z), that is:

  I(X;Z) ≤ I(Y;Z)
Appendix:

Assume:
  a binary sequence with P(0) = 1 - P(1) = 1 - p;
  t is the number of 1's in the sequence.

Then for n → ∞ and ε > 0
(weak law of large numbers):

  Probability( |t/n - p| > ε ) → 0

i.e. we expect, with high probability, about pn 1's.
Appendix:

Consequence:

1. n(p - ε) < t < n(p + ε) with high probability

2. Σ_{t=n(p-ε)}^{n(p+ε)} C(n,t) ≈ 2nε · C(n,pn) ≈ 2nε · 2^{nh(p)}

3. lim_{n→∞} (1/n) log2 [ 2nε · C(n,pn) ] = h(p),

   where h(p) = -p log2 p - (1-p) log2 (1-p)

Homework: prove the approximation using ln N! ≈ N ln N for N large,
or use the Stirling approximation: N! ≈ √(2πN) · N^N · e^{-N}.
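A numeric check of item 3 (my addition), using lgamma to evaluate log C(n, pn):

```python
import math

def h(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1
for n in (100, 1_000, 10_000, 100_000):
    t = round(p * n)
    log2_binom = (math.lgamma(n + 1) - math.lgamma(t + 1)
                  - math.lgamma(n - t + 1)) / math.log(2)
    print(n, log2_binom / n, h(p))   # (1/n) log2 C(n, pn) -> h(0.1) ≈ 0.469
```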
Binary Entropy:  h(p) = -p log2 p - (1-p) log2 (1-p)

  (Plot: h(p) versus p on [0, 1]; h(0) = h(1) = 0 and h(½) = 1.)

  Note: h(p) = h(1-p)
Capacity for Additive White Gaussian Noise

  Input X → (+ noise) → Output Y

  Cap := sup over p(x), subject to the power constraint
         E[x²] ≤ S/2W, of [ H(Y) - H(Noise) ]

  W is the (single-sided) bandwidth.

Input X is Gaussian with power spectral density (psd) ≤ S/2W;
Noise is Gaussian with psd = σ²noise;
Output Y is Gaussian with psd σ²y = S/2W + σ²noise.

For Gaussian channels: σ²y = σ²x + σ²noise.
  Cap = ½ log2( 2πe(σ²x + σ²noise) ) - ½ log2( 2πe σ²noise )   bits/trans.

      = ½ log2( (σ²x + σ²noise) / σ²noise )   bits/trans.

  Cap = W log2( (σ²noise + S/2W) / σ²noise )   bits/sec.

using, for a Gaussian Z with variance σ²z:

  p(z) = (1/√(2πσ²z)) · e^(-z²/2σ²z);   H(Z) = ½ log2(2πe σ²z) bits
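In the familiar SNR form this is C = W log2(1 + SNR). A small helper (my addition; the numbers are illustrative):

```python
import math

def awgn_capacity(S, W, sigma2_noise):
    """C = W log2(1 + (S/2W) / sigma2_noise) bits/sec."""
    return W * math.log2(1 + (S / (2 * W)) / sigma2_noise)

# e.g. W = 1 MHz and a per-dimension SNR of 15 gives 4 Mbit/s
print(awgn_capacity(S=30.0, W=1e6, sigma2_noise=1e-6))
```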
Middleton type of burst channel model

  (Diagram: a binary channel whose transition probability is
  itself selected at random from a collection of channels
  1, 2, ..., k, ...)

Select channel k with probability Q(k);
channel k has transition probability p(k).
Fritchman model:

multiple states G and only one state B;
closer to an actual real-world channel.

  (Diagram: a Markov chain over states G1, ..., Gn, B; the
  label 1-p marks the transition at state B.)

  Error probability 0 in the good states G1, ..., Gn;
  error probability h in the bad state B.
Interleaving: from bursty to random

  message → encoder → interleaver → bursty channel →
  interleaver⁻¹ (de-interleaver) → decoder → message

After de-interleaving, the channel errors look like "random errors".

Note: interleaving brings encoding and decoding delay.

Homework: compare block and convolutional interleaving w.r.t. delay
(sketches of both follow below).
Interleaving: block

Channel models are difficult to derive:
  - burst definition?
  - random and burst errors?
For practical reasons: convert bursts into random errors.

Read in row wise, transmit column wise:

  1 0 1 0 1
  0 1 0 0 0
  0 0 0 1 0
  1 0 0 1 1
  1 1 0 0 1
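A minimal sketch of this row-in/column-out interleaver and its inverse (my addition), using a 5×5 array as in the slide:

```python
import numpy as np

def interleave(bits, rows, cols):
    """Write row-wise into a rows x cols array, read out column-wise."""
    return np.asarray(bits).reshape(rows, cols).T.reshape(-1)

def deinterleave(bits, rows, cols):
    """Inverse: write column-wise, read out row-wise."""
    return np.asarray(bits).reshape(cols, rows).T.reshape(-1)

data = np.arange(25)                 # stand-in for 25 code bits
tx = interleave(data, 5, 5)
assert (deinterleave(tx, 5, 5) == data).all()
# a burst of 5 consecutive channel errors in tx lands in 5 different rows
```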
De-Interleaving: block

Read in column wise, read out row wise. A burst of erasures (e)
on the channel fills columns, so after de-interleaving each row
(codeword) contains only one or two erasures:

  1 0 1 e 1
  0 1 e e 0
  0 0 e 1 0
  1 0 e 1 1
  1 1 e 0 1
Interleaving: convolutional

  input sequence 0:   no delay
  input sequence 1:   delay of b elements
  ...
  input sequence m-1: delay of (m-1)b elements

Example: b = 5, m = 3.
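A minimal sketch of such a convolutional interleaver (my addition): symbols are distributed cyclically over m branches, and branch i is delayed by i·b elements.

```python
from collections import deque

def make_conv_interleaver(b, m):
    """Branch i is a FIFO of i*b zeros (branch 0 passes straight through)."""
    lines = [deque([0] * (i * b)) for i in range(m)]
    i = 0
    def step(symbol):
        nonlocal i
        lines[i].append(symbol)
        out = lines[i].popleft()      # symbol delayed by i*b visits to branch i
        i = (i + 1) % m
        return out
    return step

step = make_conv_interleaver(b=5, m=3)   # b = 5, m = 3 as in the slide
print([step(s) for s in range(30)])
```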
