Continuous symbols

In the article Entropy definition, we defined entropy for discrete symbols as:

H = −K Σᵢ pᵢ log pᵢ
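
As a quick numerical illustration of this formula, here is a minimal Python sketch (my own example, not from the original article; the function name discrete_entropy is just an assumed label):

import math

def discrete_entropy(probs, K=1.0):
    """H = -K * sum(p_i * log2(p_i)); with K = 1 the result is in bits."""
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely symbols carry 2 bits per symbol.
print(discrete_entropy([0.25] * 4))   # 2.0
# A biased binary source carries less than 1 bit per symbol.
print(discrete_entropy([0.9, 0.1]))   # ~0.469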

The same definition can be extended to "continuous" symbols, where values can differ by an infinitesimally small dx:

H(x) = −∫ p(x) log p(x) dx

The more general equation, when x can take any value from −∞ to ∞, is:

H(x) = −∫ p(x) log p(x) dx, with the integral taken from −∞ to ∞
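
As a sanity check on this definition, here is a small numerical sketch (my own example, not from the article): the differential entropy of a uniform distribution over [0, 4] should come out to log2 4 = 2 bits.

import math

def differential_entropy(p, lo, hi, n=100_000):
    """Approximate H(x) = -integral of p(x) * log2(p(x)) dx with a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        px = p(lo + (i + 0.5) * dx)
        if px > 0:
            total -= px * math.log2(px) * dx
    return total

# Uniform distribution over [0, 4]: p(x) = 1/4 inside the interval, 0 outside.
uniform = lambda x: 0.25 if 0 <= x <= 4 else 0.0
print(differential_entropy(uniform, 0, 4))   # ~2.0 bits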

To evaluate the entropy equation above, we need the probability distribution p(x). We know that all probabilities add up to 1, i.e.

∫ p(x) dx = 1

To proceed further, we need a bit of mathematics. Let us restrict x by fixing its standard deviation at σ. In that case,

σ² = ∫ x² p(x) dx
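
Purely as a numerical check of these two constraints (my own sketch, not from the article), any candidate p(x) can be integrated directly; here a uniform distribution over [−a, a] is used, whose variance is a²/3.

def integrate(f, lo, hi, n=100_000):
    """Midpoint Riemann sum of f over [lo, hi]."""
    dx = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * dx) for i in range(n)) * dx

a = 3.0
p = lambda x: 1.0 / (2 * a)                      # uniform density over [-a, a]
print(integrate(p, -a, a))                       # integral of p(x) dx      -> 1.0
print(integrate(lambda x: x * x * p(x), -a, a))  # integral of x^2 p(x) dx  -> 3.0 (= a^2/3)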

With the above two constraints, it can be shown mathematically that the maximum value of H(x) is achieved when p(x) is the Gaussian (normal) distribution:

p(x) = (1/√(2πσ²)) e^(−x²/2σ²)

The maximum value of H(x) is then:

H(x) = log √(2πeσ²) bits per symbol

(the square root applies to the complete 2πeσ² term)
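
As a rough numerical cross-check (my own sketch, not from the article; the helper differential_entropy mirrors the one above), the entropy of a Gaussian matches log √(2πeσ²) and exceeds that of a uniform distribution with the same standard deviation:

import math

def differential_entropy(p, lo, hi, n=200_000):
    """Approximate H(x) = -integral of p(x) * log2(p(x)) dx with a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        px = p(lo + (i + 0.5) * dx)
        if px > 0:
            total -= px * math.log2(px) * dx
    return total

sigma = 1.0
gaussian = lambda x: math.exp(-x**2 / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)
a = sigma * math.sqrt(3)                                    # uniform over [-a, a] with the same sigma
uniform = lambda x: 1.0 / (2 * a) if -a <= x <= a else 0.0

print(differential_entropy(gaussian, -10 * sigma, 10 * sigma))   # ~2.047 bits
print(math.log2(math.sqrt(2 * math.pi * math.e * sigma**2)))     # ~2.047 bits, i.e. log2 of sqrt(2*pi*e*sigma^2)
print(differential_entropy(uniform, -a, a))                      # ~1.792 bits, smaller as expected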

Nyquist's sampling theorem states that to reproduce a bandlimited waveform, the sampling frequency should be at least twice the highest frequency present in the waveform. So, for a signal of bandwidth W Hz, the sampling frequency would be 2W samples per second. Taking this as the symbol rate, the entropy of an information signal having a Gaussian probability distribution, in bits per second, would be:

H(x) = 2W log √(2πeσ²) bits per second

That is, since log √(2πeσ²) = ½ log(2πeσ²):

H(x) = W log(2πeσ²)

An example of a signal having a Gaussian probability distribution is white thermal noise.

The standard deviation σ is expressed in terms of amplitude. From our knowledge that the energy contained in a symbol of value v is proportional to v², we can say that σ² corresponds to the average energy rate, i.e. the average noise power N (calculated over a sufficiently long duration).

So we can say that, for a given average power N, white thermal noise has the maximum possible entropy:

H(n) = W log(2πeN)
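
To put numbers on this result, here is a small sketch (my own example with assumed values, not from the article): a noise bandwidth of W = 1 MHz and average noise power normalised to N = 1 give an entropy rate of roughly 4.09 Mbit/s.

import math

def noise_entropy_rate(W_hz, N):
    """H(n) = W * log2(2*pi*e*N) bits per second, with N in normalised power units."""
    return W_hz * math.log2(2 * math.pi * math.e * N)

print(noise_entropy_rate(1e6, 1.0))   # ~4.09e6 bits per second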

The equation for H(n) above appears as point 9 on page 38 of Shannon's paper.

References: A Mathematical Theory of Communication by Claude E. Shannon, An Introduction to Information Theory by John R. Pierce.

Copyright © Samir Amberkar 2010-11
