The Shannon-Fano tree shown in Figure 3.1 was developed from the table of symbol frequencies shown next.


Symbol  Count
A        15
B         7
C         6
D         6
E         5

Putting the dividing line between symbols B and C assigns a count of 22 to the upper group and 17 to the lower, the closest possible split to exactly half of the total of 39. This means that the codes for A and B will each start with a 0 bit, and the codes for C, D, and E will all start with a 1 bit, as shown:


Symbol  Count  Code
A        15     0
B         7     0
----- First division -----
C         6     1
D         6     1
E         5     1

Subsequently, the upper half of the table gets a new division between A and B, which puts A on a leaf with code 00 and B on a leaf with code 01. After four divisions, a complete table of codes results. In the final table, the three symbols with the highest frequencies have all been assigned 2-bit codes, and the two symbols with lower counts have 3-bit codes, as shown next.


Symbol  Count  Code
A        15     00
----- Second division -----
B         7     01
----- First division -----
C         6     10
----- Third division -----
D         6     110
----- Fourth division -----
E         5     111
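
The whole procedure can be captured in a short recursive routine. The following C sketch is not taken from any particular implementation; it assumes the symbol table is already sorted by descending count, picks the split point whose two group totals are closest to equal, appends a 0 bit to every code in the upper group and a 1 bit to every code in the lower group, then recurses on each half. Run against the table above, it produces exactly the codes shown.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <limits.h>

#define SYMBOL_COUNT 5

struct sf_node {
    char symbol;
    int count;
    char code[ 16 ];          /* code bits accumulated so far      */
};                            /* (16 is ample for this tiny table) */

/*
 * Recursively divide nodes[ first .. last ], which must be sorted by
 * descending count.  The split goes where the two groups' totals are
 * closest to equal; the upper group gets a 0 appended to its codes,
 * the lower group a 1.
 */
void divide( struct sf_node *nodes, int first, int last )
{
    long total = 0;
    long upper = 0;
    long best_diff = LONG_MAX;
    int split = first;
    int i;

    if ( first >= last )            /* a single symbol is a leaf */
        return;
    for ( i = first ; i <= last ; i++ )
        total += nodes[ i ].count;
    for ( i = first ; i < last ; i++ ) {
        upper += nodes[ i ].count;
        if ( labs( 2 * upper - total ) < best_diff ) {
            best_diff = labs( 2 * upper - total );
            split = i;
        }
    }
    for ( i = first ; i <= last ; i++ )
        strcat( nodes[ i ].code, ( i <= split ) ? "0" : "1" );
    divide( nodes, first, split );
    divide( nodes, split + 1, last );
}

int main( void )
{
    struct sf_node nodes[ SYMBOL_COUNT ] = {
        { 'A', 15, "" }, { 'B', 7, "" }, { 'C', 6, "" },
        { 'D', 6, "" },  { 'E', 5, "" }
    };
    int i;

    divide( nodes, 0, SYMBOL_COUNT - 1 );
    for ( i = 0 ; i < SYMBOL_COUNT ; i++ )
        printf( "%c  %2d  %s\n", nodes[ i ].symbol,
                nodes[ i ].count, nodes[ i ].code );
    return 0;
}

Note that when a group shrinks to a single symbol, the recursion stops: that symbol's code is complete, which is why A and B end up with 2-bit codes after only two divisions.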

That symbols with the higher probability of occurrence have fewer bits in their codes indicates we are on the right track. The formula for the information content of a given symbol is the negative of the base-two logarithm of the symbol's probability: information = -log2( probability ). For our theoretical message, the information content of each symbol, along with the total number of bits for that symbol in the message, is found in the following table.


Symbol  Count  Info Cont.  Info Bits
A        15      1.38        20.68
B         7      2.48        17.35
C         6      2.70        16.20
D         6      2.70        16.20
E         5      2.96        14.82
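
These figures are easy to verify with a few lines of C. This is just a back-of-the-envelope sketch, with the symbol counts hard-coded from the table above; it computes -log2( count / total ) for each symbol and multiplies by the count:

#include <stdio.h>
#include <math.h>

int main( void )
{
    char symbols[ 5 ] = { 'A', 'B', 'C', 'D', 'E' };
    int counts[ 5 ]   = { 15, 7, 6, 6, 5 };
    double total_bits = 0.0;
    int total_count   = 0;
    int i;

    for ( i = 0 ; i < 5 ; i++ )
        total_count += counts[ i ];
    for ( i = 0 ; i < 5 ; i++ ) {
        double p    = (double) counts[ i ] / total_count;
        double info = -log( p ) / log( 2.0 );   /* -log2( p ) */
        printf( "%c  %2d  %5.2f  %6.2f\n",
                symbols[ i ], counts[ i ], info, counts[ i ] * info );
        total_bits += counts[ i ] * info;
    }
    printf( "Total information: %.2f bits\n", total_bits );
    return 0;
}

Since the standard log2() function only arrived with C99, the sketch uses log( p ) / log( 2.0 ), which is equivalent.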

The information for this message adds up to about 85.25 bits. If we code the message using 8-bit ASCII characters, we would use 39 × 8, or 312 bits. Obviously there is room for improvement.

When we encode the same data using Shannon-Fano codes, we come up with some pretty good numbers, as shown below.


Symbol  Count  Info Cont.  Info Bits  SF Size  SF Bits
A        15      1.38        20.68       2        30
B         7      2.48        17.35       2        14
C         6      2.70        16.20       2        12
D         6      2.70        16.20       3        18
E         5      2.96        14.82       3        15
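
A quick tally of the SF Bits column, sketched below with the code lengths hard-coded from the table, confirms the total quoted in the next paragraph:

#include <stdio.h>

int main( void )
{
    int counts[ 5 ] = { 15, 7, 6, 6, 5 };
    int sizes[ 5 ]  = {  2, 2, 2, 3, 3 };   /* Shannon-Fano code lengths */
    int sf_bits = 0;
    int i;

    for ( i = 0 ; i < 5 ; i++ )
        sf_bits += counts[ i ] * sizes[ i ];
    printf( "Shannon-Fano total: %d bits (vs. 312 ASCII bits)\n", sf_bits );
    return 0;
}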

With the Shannon-Fano coding system, it takes only 89 bits to encode 85.25 bits of information. Clearly we have come a long way in our quest for efficient coding methods. And while Shannon-Fano coding was a great leap forward, it had the unfortunate luck to be quickly superseded by an even more efficient coding system: Huffman coding.

