Capacity theorem [Under Information theory > Shannon's paper]
The next topic is entropy of information. Before we look at it, let us jump ahead and see where entropy has led us.
Shannon's well-known capacity theorem (Theorem 17, page 43) is:

Capacity = W log2( 1 + S/N )

where W is the channel bandwidth and S/N is the signal to noise (power) ratio.
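To get a feel for the numbers, here is a minimal Python sketch of the formula; the bandwidth and SNR values are hypothetical ones chosen only for illustration, not taken from the paper.

import math

def channel_capacity(bandwidth_hz, snr_linear):
    # Shannon capacity in bits per second: C = W * log2(1 + S/N)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz telephone-like channel with 30 dB SNR.
snr_db = 30
snr_linear = 10 ** (snr_db / 10)            # 30 dB corresponds to a power ratio of 1000
print(channel_capacity(3000, snr_linear))   # about 29,900 bits per second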
Here are the overall steps that lead to the above theorem:
R = H(x) - Hy(x)

R is the actual rate of transmission, H(x) is the entropy of the source output x, and Hy(x) is the conditional entropy of x given the received signal y (also called the equivocation).

Capacity = Max of R = Max of ( H(x) - Hy(x) )

with the maximum taken over all possible information sources.
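To make R = H(x) - Hy(x) concrete, here is a small Python sketch for a toy example of my own (a binary symmetric channel with a uniform source, not an example from Shannon's paper); the crossover probability and distributions are assumptions made just for illustration.

import math
from itertools import product

def entropy(dist):
    # Entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Binary symmetric channel: the received bit y equals x with probability
# 1 - p_err and is flipped with probability p_err.
p_err = 0.1
p_x = {0: 0.5, 1: 0.5}                      # uniform source

p_xy = {}
for x, y in product((0, 1), repeat=2):
    p_xy[(x, y)] = p_x[x] * ((1 - p_err) if y == x else p_err)
p_y = {y: sum(p_xy[(x, y)] for x in (0, 1)) for y in (0, 1)}

H_x, H_y, H_xy = entropy(p_x), entropy(p_y), entropy(p_xy)
Hy_x = H_xy - H_y                           # equivocation Hy(x) = H(x, y) - H(y)

R = H_x - Hy_x                              # actual rate of transmission
print(R)                                    # about 0.531 bits per symbol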
If there is white thermal noise n, the received signal is y = x + n; knowing x and y is then the same as knowing x and n, so H(x, y) = H(x, n). Expanding both sides (and using the fact that x and n are independent) gives

H(y) + Hy(x) = H(x) + H(n)

making

R = H(x) - Hy(x) = H(y) - H(n)

Capacity would then be Max of ( H(y) - H(n) ).
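These identities can be checked numerically on a small discrete stand-in for the additive noise channel; the signal and noise distributions below are again hypothetical values of my own, chosen only to make the arithmetic visible.

import math
from itertools import product

def entropy(dist):
    # Entropy in bits of a distribution given as {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Toy additive channel: y = x + n, with x and n independent.
p_x = {0: 0.5, 1: 0.5}                      # hypothetical signal levels
p_n = {0: 0.8, 1: 0.2}                      # hypothetical noise levels

p_xy = {}
for (x, px), (n, pn) in product(p_x.items(), p_n.items()):
    y = x + n
    p_xy[(x, y)] = p_xy.get((x, y), 0) + px * pn
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0) + p

H_x, H_n, H_y, H_xy = entropy(p_x), entropy(p_n), entropy(p_y), entropy(p_xy)
Hy_x = H_xy - H_y

print(H_y + Hy_x, H_x + H_n)                # both about 1.722: H(x, y) = H(x, n)
print(H_x - Hy_x, H_y - H_n)                # both about 0.639: the two forms of R agree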
Since H(n) is fixed by the noise, maximisation of capacity means maximisation of H(y).
Now, for a given average power P of the transmitted signal and average power N of the (independent) white noise, the received signal has average power P + N; H(y) is maximum when y itself is white noise, and that maximum is W log 2πe(P + N), the entropy per second of white noise of power P + N in band W. H(n) is W log 2πeN.
Capacity = Max of H(y) - H(n) = W log 2πe(P + N) - W log 2πeN = W log( (P + N)/N ) = W log( 1 + P/N )
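As a last sanity check on the arithmetic, the sketch below (with hypothetical values for W, P and N, and log taken to base 2 so the result is in bits per second) confirms that the difference of the two entropy expressions equals the closed form W log( 1 + P/N ).

import math

W, P, N = 3000.0, 10.0, 1.0                 # hypothetical bandwidth (Hz) and powers

H_y = W * math.log2(2 * math.pi * math.e * (P + N))   # max entropy rate of received signal
H_n = W * math.log2(2 * math.pi * math.e * N)         # entropy rate of the white noise

print(H_y - H_n)                            # about 10378.3 bits per second
print(W * math.log2(1 + P / N))             # same value: the 2*pi*e factors cancel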
In the next article, we will look at the concept of entropy and then later try to follow the above steps in a little more detail.
References: A Mathematical Theory of Communication by Claude E. Shannon; An Introduction to Information Theory by John R. Pierce.