
HW1, EE 503 (Information theory and coding) spring 2010

by Nasser M. Abbasi

June 15, 2014

Contents

1 Problem 3.2
2 Problem 3.3

Date due and handed in: Feb. 11, 2010

1 Problem 3.2

Consider a source which generates 2 symbols x1, x2 with probabilities p(x1) = p and p(x2) = 1 − p = q. Now consider a sequence of 2 outputs from the source as a single symbol, so that X² = {x1x1, x1x2, x2x1, x2x2}. Assuming that consecutive outputs from the source are statistically independent, show directly that H(X²) = 2H(X).

Solution

First find H(X):

    H(X) = ∑_{i=1}^{M} p_i log2(1/p_i)

Where M in this case is 2. The above becomes

    H(X) = p log2(1/p) + q log2(1/q)

Hence 2H(X) becomes

    2H(X) = 2p log2(1/p) + 2q log2(1/q)                    (1)
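As a quick numerical sketch (not part of the original solution), the binary entropy H(X) = p log2(1/p) + q log2(1/q) can be evaluated directly; the function name below is my own:

```python
import math

def binary_entropy(p):
    """Entropy H(X) in bits of a binary source with P(x1) = p, P(x2) = 1 - p."""
    q = 1.0 - p
    # A zero-probability term contributes 0 (by the limit p*log2(1/p) -> 0).
    h = 0.0
    for prob in (p, q):
        if prob > 0:
            h += prob * math.log2(1.0 / prob)
    return h

print(binary_entropy(0.5))  # maximum entropy for a binary source: 1.0 bit
```

For p = 0.5 the entropy is exactly 1 bit, the maximum for a two-symbol source.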

Now we consider X². Below we write X² with the probability of each pair of outputs, viewed as a single symbol, above it. Notice that since consecutive outputs are statistically independent, p(x1x2) = p(x1)p(x2), so we obtain

          p²     pq     qp     q²
    X² = {x1x1,  x1x2,  x2x1,  x2x2}

Hence since

    H(X²) = ∑_{i=1}^{N} p_i log2(1/p_i)

Where N = 4 now, the above becomes

    H(X²) = p² log2(1/p²) + pq log2(1/(pq)) + qp log2(1/(qp)) + q² log2(1/q²)
          = A + B + C                                      (2)

with A = p² log2(1/p²), B = pq log2(1/(pq)) + qp log2(1/(qp)) = 2pq log2(1/(pq)), and C = q² log2(1/q²).

Now we expand the terms labeled A, B, C above and simplify; we will then obtain (1), showing the desired result.

    A = p² log2(1/p²) = 2p² log2(1/p)

    B = 2pq log2(1/(pq)) = 2pq log2(1/p) + 2pq log2(1/q)

    C = q² log2(1/q²) = 2q² log2(1/q)

Substituting the results we found for A, B, C back into (2), we obtain

    H(X²) = 2p² log2(1/p) + 2pq log2(1/p) + 2pq log2(1/q) + 2q² log2(1/q)    (3)

But since p + q = 1, the above can be written as

    H(X²) = 2p(p + q) log2(1/p) + 2q(p + q) log2(1/q)
          = 2p log2(1/p) + 2q log2(1/q)                    (4)

Comparing (4) and (1), we see they are the same.

Hence

|-----------------|
| H(X²) = 2H(X)   |
|-----------------|
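The result above can also be checked numerically (a verification sketch of my own, not part of the original solution): build the four-symbol joint distribution p(xi xj) = p(xi)p(xj) for an arbitrary p and compare its entropy with 2H(X).

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

p = 0.3
q = 1.0 - p
h_x = entropy([p, q])
# Joint distribution of two independent outputs: p(xi xj) = p(xi) * p(xj).
h_x2 = entropy([p * p, p * q, q * p, q * q])
print(h_x2, 2 * h_x)  # the two values agree
```

Repeating this for any p in (0, 1) gives the same agreement, as the derivation requires.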

2 Problem 3.3

A source emits a sequence of independent symbols from an alphabet X of symbols x1, ⋯, x5 with probabilities {1/4, 1/8, 1/8, 3/16, 5/16}. Find the entropy of the source alphabet.

Solution

    H(X) = ∑_{i=1}^{M} p_i log2(1/p_i)

Where M = 5, hence the above becomes

    H(X) = (1/4) log2 4 + (1/8) log2 8 + (1/8) log2 8 + (3/16) log2(16/3) + (5/16) log2(16/5)

Using log2 x = (log10 x)(log2 10) with log2 10 = 3.32193, the above evaluates to

    H(X) = 0.5 + 0.375 + 0.375 + 0.45282 + 0.52440
         = 2.22722 bits per symbol

To verify, we know that H(X) must be less than or equal to log2 M, where M = 5 in this case. Hence H(X) ≤ log2 5, or H(X) ≤ 2.32193, so our result above agrees with this upper bound.
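The arithmetic and the upper-bound check can be confirmed numerically (my own sketch, not part of the original solution):

```python
import math

probs = [1/4, 1/8, 1/8, 3/16, 5/16]
assert abs(sum(probs) - 1.0) < 1e-12  # valid probability distribution

h = sum(p * math.log2(1.0 / p) for p in probs)
print(round(h, 5))        # entropy of the source, in bits per symbol
print(h <= math.log2(5))  # True: entropy is below the log2(M) bound
```

The computed value matches the hand calculation above and stays under log2 5 ≈ 2.32193.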