Capacity theorem


The next topic is entropy of information. Before we look at it, let us jump ahead and see where entropy has led us.

Shannon's well-known capacity equation (Theorem 17, page 43 of his paper) is:

Capacity = W log2( 1 + P/N )

where W is the channel bandwidth, P the average signal power, and N the average noise power.
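As a quick numerical illustration (added here, not part of the original article), a minimal Python sketch evaluating this formula; the 3.1 kHz bandwidth and 30 dB SNR are assumed values, roughly those of an analogue telephone channel:

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    # C = W * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1.0 + snr_linear)

snr_db = 30.0
snr_linear = 10.0 ** (snr_db / 10.0)         # 30 dB -> a ratio of 1000
print(shannon_capacity(3100.0, snr_linear))  # about 30.9 kbit/s

This rough figure is consistent with why classical voiceband modems topped out at a few tens of kbit/s.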

Here are the overall steps that lead to the above theorem:

R = H(x) - Hy(x)

R is the actual rate of transmission, H(x) is the entropy of the source output x, and Hy(x) is the conditional entropy of x given the received signal y (also called the equivocation).
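To make these quantities concrete, here is a small Python sketch (an added illustration, not part of Shannon's derivation) for a binary symmetric channel with equiprobable inputs, where H(x) = 1 bit per symbol and the equivocation Hy(x) equals the binary entropy of the crossover probability p:

import math

def h2(p):
    # binary entropy function, in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

p = 0.1                # crossover probability (illustrative value)
R = 1.0 - h2(p)        # R = H(x) - Hy(x) = 1 - h2(p)
print(R)               # about 0.531 bits per symbol

At p = 0 nothing is lost and R = H(x); at p = 0.5 the equivocation equals H(x) and R drops to zero.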

Capacity = Max of R = Max of ( H(x) - Hy(x) )

If there exists white thermal noise n, so that the received signal is y = x + n, then H(x, y) = H(x, n),
leading to H(y) + Hy(x) = H(x) + H(n), since x and n are independent,
making R = H(x) - Hy(x) = H(y) - H(n).

Capacity would then be = Max of ( H(y) - H(n) ).
Since H(n) is fixed by the channel, maximisation of capacity means maximisation of H(y).

Now, for a given average transmitted signal power P and an independent white noise of average power N, the received signal y has average power P + N; H(y) is maximised when y itself is distributed like white noise (i.e. Gaussian), and the maximum is W log 2πe(P + N) per second.

Similarly, H(n) is W log 2πeN.

Capacity = Max of ( H(y) - H(n) ) = W log 2πe(P + N) - W log 2πeN = W log( (P + N)/N ) = W log( 1 + P/N )
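As a sanity check on this algebra (the W, P, and N values below are assumed purely for illustration), the 2πe factors cancel in the subtraction, so the entropy difference and the final formula agree numerically:

import math

W, P, N = 4000.0, 1e-3, 1e-4     # Hz, watts, watts (assumed values)
H_y_max = W * math.log(2.0 * math.pi * math.e * (P + N))
H_n = W * math.log(2.0 * math.pi * math.e * N)
print(H_y_max - H_n)              # entropy difference, nats per second
print(W * math.log(1.0 + P / N))  # W log(1 + P/N), the same value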

In the next article, we will look at the concept of entropy, and later we will try to follow the above steps in a little more detail.

References: Claude E. Shannon, "A Mathematical Theory of Communication"; John R. Pierce, "An Introduction to Information Theory".

Copyright © Samir Amberkar 2010-11
