Shannon’s theorem then says that any channel with this capacity can carry the report, and that it cannot be carried by any channel with less than this capacity. It also says that a coding always exists by which the channel can be so used.

It was, perhaps, obvious enough that high-speed channels could report more than slow; what is important about this theorem is, first, its great generality (for it makes no reference to any specific machinery, and therefore applies to telegraphs, nerve-fibres, conversation, equally) and secondly its quantitative rigour. Thus, if the pond were far in the hills, the question might occur whether smoke signals could carry the report. Suppose a distinct puff could be either sent or not sent in each quarter-minute, but not faster. The entropy per symbol is here 1 bit, and the channel’s capacity is therefore 4 bits per minute. Since 4 is greater than 2.53, the channel can do the reporting, and a code can be found, turning positions to puffs, that will carry the information.
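The arithmetic can be made explicit in a line or two; a minimal sketch (the 2.53 bits per minute required by the report is the figure quoted above):

```python
from math import log2

# One distinct puff either sent or not sent per quarter-minute:
# two equally available states per symbol -> 1 bit per symbol.
bits_per_symbol = log2(2)                         # = 1.0
symbols_per_minute = 4                            # one per quarter-minute
capacity = bits_per_symbol * symbols_per_minute   # bits per minute

report_rate = 2.53   # bits per minute needed by the report (from the text)
print(capacity, capacity > report_rate)           # 4.0 True
```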
Shannon has himself constructed an example which shows exquisitely the exactness of this quantitative law. Suppose a
source is producing letters A, B, C, D with frequencies in the ratio of 4, 2, 1, 1 respectively, the successive symbols being independent. A typical portion of the sequence would be … B A A B D A A A A B C A B A A D A …. At equilibrium the relative frequencies of A, B, C, D would be 1/2, 1/4, 1/8, 1/8 respectively, and the entropy is 1 3/4 bits per step (i.e. per letter).
Now a channel that could produce, at each step, any one of four states without constraint would have a capacity of 2 bits per step. Shannon’s theorem says that there must exist a coding that will enable the latter channel (of capacity 2 bits per step) to transmit such a sequence (with entropy 1 3/4 bits per step) so that any long message requires fewer steps in the ratio of 2 to 1 3/4, i.e. of 8 to 7.
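The 1 3/4-bit entropy, and hence the 8 to 7 ratio, can be verified directly from the equilibrial frequencies; a quick sketch:

```python
from math import log2
from fractions import Fraction

# Equilibrial relative frequencies of A, B, C, D
p = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8), Fraction(1, 8)]

# Entropy per letter: H = -sum p_i log2 p_i
H = -sum(float(pi) * log2(float(pi)) for pi in p)
print(H)                               # 1.75 bits per step

# Ratio of channel capacity (2 bits/step) to source entropy
print(Fraction(2) / Fraction(7, 4))    # 8/7
```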
The coding, devised by Shannon, that achieves this is as follows. First code the message by

    A   B   C   D
    ↓   ↓   ↓   ↓
    0   10  110 111

E.g. the message above,

    B  A A B  D   A A A A B  C   A B  A A D   A
    10 0 0 10 111 0 0 0 0 10 110 0 10 0 0 111 0

i.e. 100010111000010110010001110. Now divide the lower line into pairs and re-code into a new set of letters by

    00  01  10  11
    ↓   ↓   ↓   ↓
    E   F   G   H

These codes convert any message in “A to D” into the letters “E to H”, and conversely, without ambiguity. What is remarkable is that if we take a typical set of eight of the original letters (each represented with its typical frequency) we find that they can be transmitted as seven of the new:

    A  A  A  A  B  B  C   D
    0  0  0  0  10 10 110 111
    00 00 10 10 11 01 11
    E  E  G  G  H  F  H

Ex. 2: Printed English has an entropy of about 10 bits per word. We can read about 200 words per minute. Give a lower bound to the channel capacity of the optic nerve.

Ex. 3: If a pianist can put each of ten fingers on any one of three notes, and can do this 300 times a minute, find a lower bound to the channel capacity of the nerves to the upper limbs.

Ex. 4: A bank’s records, consisting of an endless sequence of apparently random digits, 0 to 9, are to be encoded into Braille for storage. If 10,000 digits are to be stored per hour, how fast must the Braille be printed if optimal coding is used? (Hint: There are 64 symbols in the Braille “alphabet”.)

One more example will be given, to show the astonishing power that Shannon’s method has of grasping the essentials in communication. Consider the system, of states a, b, c, d, with transition probabilities

    ↓   a    b    c    d
    a   0    0    0.3  0.3
    b   0.6  0.6  0    0
    c   0.4  0.4  0    0
    d   0    0    0.7  0.7
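A transition table of this kind is easily simulated; the sketch below (the starting state and random seed are arbitrary choices for illustration) draws each next state from the column of the present one:

```python
import random

# Transition probabilities, column by column: present state -> {next: prob}
T = {
    'a': {'b': 0.6, 'c': 0.4},
    'b': {'b': 0.6, 'c': 0.4},
    'c': {'a': 0.3, 'd': 0.7},
    'd': {'a': 0.3, 'd': 0.7},
}

def generate(n, state='b', seed=0):
    """Generate n letters from the chain, starting in the given state."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        out.append(state)
        nxt, probs = zip(*T[state].items())
        state = rng.choices(nxt, weights=probs)[0]
    return ''.join(out)

print(generate(30))
```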
A typical sequence would be

… b b b c a b c a b b c d d a c d a b c a c d d d d d d a b b …
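Both the long-run letter frequencies and the entropy per letter can be checked by direct computation from the table; a sketch (iterating p ← Tp from a uniform start):

```python
from math import log2

# Columns of the transition table: present state -> {next: prob}
T = {
    'a': {'b': 0.6, 'c': 0.4},
    'b': {'b': 0.6, 'c': 0.4},
    'c': {'a': 0.3, 'd': 0.7},
    'd': {'a': 0.3, 'd': 0.7},
}
states = 'abcd'

# Iterate the state-probability vector until it settles at equilibrium
p = {s: 0.25 for s in states}
for _ in range(200):
    p = {s: sum(T[t].get(s, 0.0) * p[t] for t in states) for s in states}

print({s: round(p[s], 4) for s in states})
# a, b, c, d -> 6/35, 9/35, 6/35, 14/35

# Entropy per letter: each column's entropy, weighted by how often
# the chain is in that column's state
H = sum(p[t] * sum(-q * log2(q) for q in T[t].values()) for t in states)
print(round(H, 2))   # 0.92
```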
The equilibrial probabilities are 6/35, 9/35, 6/35, 14/35 respectively. The entropy is soon found to be 0.92 bits per letter. Now suppose that the distinction between a and d is lost, i.e. code by

    a  b  c  d
    ↓  ↓  ↓  ↓
    X  b  c  X

Surely some information must be lost? Let us see. There are now only three states X, b, c, where X means “either a or d”. Thus the previous message would now start … b b b c X b c X b b c X