CHAPTER 2




  1. For a source that produces 42 symbols with equal probability, the entropy of the source is

    H = log2 42 bits/symbol

    = 5.39 bits/symbol

  2. For a source that produces two symbols A and B with probabilities of 0.6 and 0.4, respectively, the entropy is

    H = -(0.6 log2 0.6 + 0.4 log2 0.4) = 0.971 bits/symbol

  3. In ASCII, each character is represented by seven bits, and the frequency of occurrence of the English letters is not taken into consideration at all. If the frequency of occurrence were taken into account, the most frequently occurring letters would be assigned short code words (such as 2 bits) and the less frequently occurring letters would be assigned longer code words. According to Shannon's theory, therefore, ASCII is not an efficient coding technique.

    However, note that an efficient coding technique involves considerable additional processing, which causes delay in decoding the text.

  4. You can write a program that obtains the frequency of occurrence of the English letters. The program takes a text file as input and produces the frequency of occurrence of all the letters and spaces; punctuation marks can be ignored, and all letters should be converted to either capital or small letters. Based on these frequencies, if you apply Shannon's formula for entropy, you will get a value close to 4.07 bits/symbol (a sketch of such a program is given after this list).

  5. You can modify the above program to calculate the frequencies of two-letter combinations (aa, ab, ac, ..., zy, zz). Again, if you apply the formula, you will get a value close to 3.36 bits/symbol.
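
The following Python sketch illustrates one way to carry out exercises 4 and 5. It is only an outline under a few assumptions: the file name "sample.txt" is a placeholder for any reasonably long English text, the text is reduced to lowercase letters and spaces with punctuation ignored, and the digram entropy is reported per letter (the entropy of the pair counts divided by two). The exact values you obtain will depend on the text you use, but they should be close to the figures quoted above.

    import math
    from collections import Counter

    def entropy(counts):
        # Shannon entropy in bits per symbol for a table of symbol counts.
        total = sum(counts.values())
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    def letter_stream(path):
        # Yield lowercase letters and spaces from the file, ignoring punctuation.
        with open(path, encoding="utf-8") as f:
            for ch in f.read().lower():
                if ch.isalpha() or ch == " ":
                    yield ch

    def single_letter_entropy(path):
        # Exercise 4: entropy computed from single-letter (and space) frequencies.
        return entropy(Counter(letter_stream(path)))

    def digram_entropy_per_letter(path):
        # Exercise 5: entropy of two-letter combinations (aa, ab, ..., zz),
        # divided by 2 to express it per letter.
        text = "".join(letter_stream(path))
        pairs = Counter(text[i:i + 2] for i in range(len(text) - 1))
        return entropy(pairs) / 2

    if __name__ == "__main__":
        # "sample.txt" is a placeholder; use any reasonably long English text.
        print("Single-letter entropy:", single_letter_entropy("sample.txt"))
        print("Digram entropy per letter:", digram_entropy_per_letter("sample.txt"))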


