Uncertainty and Entropy


In the last article, we considered an example with the following two cases:

A) 32 symbols with equal probability (1/32 each)
B) 32 symbols, with Sx having probability 0.5 and the rest having probability 0.5/31 each

Entropy for case A is 5 bits, whereas for case B it is about 3.48 bits. What is the difference between the above two cases that makes their entropies different?

We already know the answer: in case B, we choose to give the shortest bit representation to Sx. We do that because Sx occurs much more often than the other symbols. In other words, more certainty (or less uncertainty) is attached to the information produced by the source in case B than in case A. So it seems entropy is measuring this uncertainty: a higher entropy value indicates more uncertainty in the symbols produced.
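As a quick numerical check of the two values above, here is a small Python sketch (assuming the case A/B probabilities described above, and taking K = 1 so that entropy comes out in bits):

    import math

    def entropy(probs, K=1.0):
        # Shannon entropy H = -K * sum(p * log2(p)); in bits when K = 1
        return -K * sum(p * math.log2(p) for p in probs if p > 0)

    # Case A: 32 symbols, each with probability 1/32
    case_a = [1 / 32] * 32
    # Case B: Sx with probability 0.5, the remaining 31 symbols with 0.5/31 each
    case_b = [0.5] + [0.5 / 31] * 31

    print(entropy(case_a))   # 5.0
    print(entropy(case_b))   # ~3.48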

Since we have a mathematical equation to measure uncertainty, we can further verify our understanding with the following two points:

1) When we are fully certain, entropy should be at its lowest (i.e. zero as per the equation)
2) When we are fully uncertain, entropy should be at its highest (i.e. K log2(N) as per the equation)

When we are fully certain, it means there is only one symbol (with probability 1), and so only one representation is needed. Entropy for this case is H = -K (1) log2(1) = 0.

When we are fully uncertain, we assign the same number of bits to each symbol; that is, the probability of each symbol becomes equal (1/N each). Entropy for such a case is
H = -K (N) (1/N) log2(1/N) = K log2(N)
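A minimal Python sketch of these two limiting cases (again assuming K = 1, so entropy is in bits):

    import math

    def entropy(probs, K=1.0):
        # Shannon entropy H = -K * sum(p * log2(p))
        return -K * sum(p * math.log2(p) for p in probs if p > 0)

    # Fully certain: a single symbol with probability 1 -> H = 0
    print(entropy([1.0]))                       # 0.0

    # Fully uncertain: N equally likely symbols -> H = log2(N)
    N = 32
    print(entropy([1 / N] * N), math.log2(N))   # 5.0 5.0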

In the next article, we will try to relate Entropy with Capacity of transmission.

References: Claude E. Shannon, "A Mathematical Theory of Communication"; John R. Pierce, "An Introduction to Information Theory".

Copyright © Samir Amberkar 2010-11
