
Deducing capacity theorem

Let us combine our discussions from the last two articles to deduce the channel capacity, assuming white thermal noise and given average powers for both the transmitted signal and the noise.

From the article on Rate in terms of noise entropy:

When the transmitted signal and the noise are independent, and the received signal is the sum of the transmitted signal and the noise, the rate of transmission is:

R = H(y) − H(n)
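For reference, a sketch of the argument behind that expression (recapping the earlier article): the rate is the entropy of the received signal less the equivocation, and when y = x + n with n independent of x, the conditional entropy of y given x is just the entropy of the noise:

\[
R = H(y) - H_x(y), \qquad y = x + n \;\Rightarrow\; H_x(y) = H(n) \;\Rightarrow\; R = H(y) - H(n)
\]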

If we would like to know the capacity of this (noisy) channel, we will have to maximise the rate of transmission, i.e.

C = Rmax = ( H(y) − H(n) )max

Since n is independent of x, the only adjustable parameter in the above equation is H(y). So,

C = H(y)max − H(n)

From the article on Continuous symbols:

We saw that, taking N as the average noise power, the maximum possible entropy of the noise is:

H(n) = W log(2πeN)
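As a quick recap of where this expression comes from (using the standard results for Gaussian entropy and Nyquist-rate sampling): a Gaussian sample of variance N has entropy ½ log(2πeN), and a signal limited to band W is specified by 2W samples per second, so per second:

\[
H(n) = 2W \cdot \tfrac{1}{2}\log(2\pi e N) = W \log(2\pi e N)
\]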

As seen earlier (mathematically), for a given average power, entropy is maximised when the signal follows a Gaussian distribution. To maximise H(y), we can therefore shape x so that y forms a Gaussian distribution whose average power is the sum of the average noise power N and the average transmitted power (say P). The maximum entropy achieved would be:

H(y)max = W log(2πe (P+N) )

Substituting the above values into the capacity equation, we get (theorem 16, page 43):

C = W log(2πe (P+N) ) − W log(2πeN)

That is:

C = W log( (P+N)/N )
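As a quick numerical check, here is a minimal sketch in Python. It computes the capacity both in the simplified form and as the entropy difference above, showing the two agree; the 3 kHz bandwidth and 30 dB SNR figures are illustrative values, and log base 2 is used so the capacity comes out in bits per second.

import math

def capacity(W, P, N):
    """Shannon capacity C = W * log2((P + N) / N), in bits/second."""
    return W * math.log2((P + N) / N)

def capacity_via_entropies(W, P, N):
    """Same capacity computed as H(y)max - H(n), the difference form above."""
    H_y_max = W * math.log2(2 * math.pi * math.e * (P + N))
    H_n = W * math.log2(2 * math.pi * math.e * N)
    return H_y_max - H_n

W = 3000.0          # bandwidth in Hz (illustrative)
P, N = 1000.0, 1.0  # average signal and noise powers, i.e. 30 dB SNR (illustrative)

print(capacity(W, P, N))                # ~29901.7 bits/s
print(capacity_via_entropies(W, P, N))  # same value: the 2πe terms cancel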

References: A Mathematical Theory of Communication by Claude E. Shannon, An Introduction to Information Theory by John R. Pierce.

Copyright © Samir Amberkar 2010-11
