Shannon's paper - Capacity definition

In 1948, Claude E. Shannon published a paper, "A Mathematical Theory of Communication", which went on to form an integral part of the foundation of (the mathematics of) telecommunication as we know it today. Many of the terms used by Shannon in the paper are now standard in telecommunication. In the following chapters, we will look at the theorems and ideas described by Shannon in this paper (not necessarily in the same order as presented in the paper). The article is available in both PDF and book format.

Capacity of a discrete channel

A discrete channel is one that carries symbols belonging to a finite set (as in digital wireless transmission, where different symbols represent different combinations of a certain finite number of 0s and 1s). Referring to earlier work by Nyquist and Hartley, Shannon defines the capacity of a discrete channel as (page 3):

C = Lim T → ∞ log2 N(T) / T

N(T) is the number of allowed signals (sequences of symbols) of duration T. Symbols can be of different durations.

Also, the symbols are assumed to have been devised in such a way that, in whatever allowed order they are arranged, they can be decoded without ambiguity.
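For the bit-string codes that appear later in this article, one simple way to guarantee such unambiguous decoding is a prefix-free code, in which no codeword is a prefix of another. Below is a minimal sketch of such a check in Python (a hypothetical helper, not from Shannon's paper):

def is_prefix_free(codewords):
    # A code is prefix-free if no codeword is a prefix of another;
    # a prefix-free code can be decoded symbol by symbol without ambiguity.
    for i, a in enumerate(codewords):
        for j, b in enumerate(codewords):
            if i != j and b.startswith(a):
                return False
    return True

print(is_prefix_free(["0", "10", "110"]))  # True  -> decodable without ambiguity
print(is_prefix_free(["0", "10", "100"]))  # False -> "10" is a prefix of "100"

Prefix-freeness is sufficient (though not necessary) for unique decodability, and it is the property used in the English-language example later in this article.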

The above equation defines capacity for the general case. For insight into this equation, let us take a special case wherein all symbols have the same, constant duration.

Say each symbol is n bits long and the symbol rate is s symbols per second. The capacity of the channel would of course be ns bits per second. Let us see if the above equation gives us the same answer.

Since s symbols take 1 second to transfer, if T is the duration of one symbol, then T = 1/s.

N(T) would be the number of possible symbols of n bits, i.e. 2^n.

So log2 N(T) / T = log2(2^n) / T = n / (1/s) = ns.

Lim T → ∞ has no impact here, as T is the same (and finite) for all symbols.

This is the same as our earlier answer.
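As a quick numerical check (a minimal sketch with assumed values for n and s, not from the paper), the snippet below evaluates log2 N(T)/T over windows of k consecutive symbols and confirms that the ratio equals ns regardless of the window size:

from math import log2

n = 5      # bits per symbol (assumed example value)
s = 1000   # symbols per second (assumed example value)

# Over a window of k symbols: T = k/s seconds and N(T) = 2**(n*k),
# since each of the n*k bit positions can vary independently.
for k in (1, 2, 10, 100):
    T = k / s
    N_T = 2 ** (n * k)
    print("k = %4d:  log2(N(T))/T = %.1f bits/s" % (k, log2(N_T) / T))

Every line prints n*s = 5000.0 bits per second, which is why the limit adds nothing when all symbols share one duration.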

The special case shows the significance of the log in the capacity equation: capacity is logarithmically related to the number of different symbols that can be transferred. In other words, if the symbol duration is kept constant and the capacity is increased (more bits in the same duration), the number of different symbols that can be transferred in that duration grows as a power of 2.

Of course, if n is constant, capacity is inversely proportional to the symbol duration (capacity is directly proportional to the symbol rate, and T = 1/s).

The significance of Lim T → ∞ is not evident in the special case, as T is the same (and finite) there for all symbols.

To gain more insight into the above equation, let us take a simple example: transmission of the English language.

For transmitting the English language, we can map each letter (and the space) to a certain signal (say, a certain combination of 0s and 1s). As there are 26 letters plus the space, we would need at least 5 bits, and we can map each letter to one unique combination of 5 bits. With this method, the English language transmission rate would be proportional to the bit rate. Now, we know that the letter 'e' is the most frequent in English, so maybe it is a good idea to map 'e' not to a combination of 5 bits but to a combination of fewer bits. This way we will get a better English language transmission rate.

In general, to improve the English language transmission rate, we can assign a representation with fewer bits to a more probable letter and one with more bits to a less probable letter (the sketch after the example below quantifies the gain).

For example:

e → 0

a → 10

i → 110

...
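Here is a minimal sketch (with made-up letter frequencies, purely for illustration) comparing the fixed 5-bit code against the variable-length code above:

from math import ceil, log2

# Hypothetical letter frequencies for a reduced alphabet (illustration only).
freq = {"e": 0.5, "a": 0.3, "i": 0.2}

# Variable-length prefix-free code from the example above.
code = {"e": "0", "a": "10", "i": "110"}

fixed_bits = ceil(log2(27))  # 26 letters + space -> 5 bits per letter
avg_bits = sum(freq[ch] * len(code[ch]) for ch in freq)

print("fixed-length code   :", fixed_bits, "bits per letter")
print("variable-length code: %.2f bits per letter on average" % avg_bits)

With these made-up frequencies, the average drops from 5 bits to 1.7 bits per letter; the more skewed the letter probabilities, the larger the saving.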

If you notice, we now have symbols of different durations. Since the duration keeps increasing as we keep adding letters, if we want to consider symbols of all durations in calculating capacity, we need to introduce Lim T → ∞.
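To see why the limit matters, consider a toy channel (an assumed example, in the spirit of Shannon's telegraph discussion) with three symbols of durations 1, 2, and 3 time units, usable in any order. The number of allowed sequences of total duration T then satisfies N(T) = N(T-1) + N(T-2) + N(T-3), and the sketch below shows log2 N(T)/T settling toward the capacity only as T grows:

from math import log2

durations = [1, 2, 3]  # assumed symbol durations, in integer time units

# N[t] = number of allowed sequences of total duration exactly t.
# A sequence of duration t ends in a symbol of duration d, preceded
# by any allowed sequence of duration t - d.
T_max = 60
N = [0] * (T_max + 1)
N[0] = 1  # the empty sequence
for t in range(1, T_max + 1):
    N[t] = sum(N[t - d] for d in durations if t - d >= 0)

for T in (5, 10, 20, 40, 60):
    print("T = %2d:  log2(N(T))/T = %.4f" % (T, log2(N[T]) / T))

The ratio creeps toward log2 of the largest real root of x^3 = x^2 + x + 1 (about 0.879 bits per time unit). No single finite T gives the exact capacity, hence the Lim T → ∞ in the definition.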

References: Claude E. Shannon, "A Mathematical Theory of Communication"; John R. Pierce, "An Introduction to Information Theory".

Copyright © Samir Amberkar 2010-11
