Capacity and Entropy

In the 3rd article, we took entropy as bits per symbol. Entropy can further be expressed in bits per second if the information source is producing symbols at a constant rate. Entropy in that case would be nothing but the information transmission rate, or simply the "capacity"!

How does this relate to the earlier definition of capacity mentioned in the 1st article?

C = lim(T → ∞) log N(T)/T

where N(T) is the number of allowed signals of duration T.

Let us take an example to see if we get the same result from entropy and the above definition.

Say the information source is producing symbols at a constant rate of s symbols per second. There are N symbols, and all have an equal probability of being produced.

Since the symbols are equally probable, each needs to be represented with the same number of bits, i.e. log₂N. The above definition then gives us C as (s log₂N) bits per second (refer to the 1st article); a quick numeric check is sketched below.
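
Here is a minimal Python sketch of that check; the values N = 4 and s = 1000 are assumed for illustration and are not from the article. For an equiprobable source emitting sT symbols in T seconds, N(T) = N^(sT), so log₂N(T)/T already equals s log₂N for every T and the limit is immediate.

    import math

    # Assumed illustrative values (not from the article): N equiprobable
    # symbols produced at a constant rate of s symbols per second.
    N = 4          # number of distinct symbols
    s = 1000       # symbols per second

    # In T seconds the source emits s*T symbols, so the number of possible
    # messages of duration T is N(T) = N**(s*T).  Then
    #   log2 N(T) / T = s*T*log2(N) / T = s*log2(N)
    # for every T, which is the limit in the capacity definition.
    for T in (1, 10, 100):
        log2_N_T = s * T * math.log2(N)   # log2(N**(s*T)), computed via logs
        print(f"T = {T:3d} s : log2 N(T)/T = {log2_N_T / T:.1f} bits/s")

    print("s*log2(N) =", s * math.log2(N), "bits/s")

Every row prints the same 2000.0 bits/s, matching s log₂N directly.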

Now consider the entropy equation, with:

pᵢ = 1/N

So H = −K Σ pᵢ log pᵢ = −K (N) (1/N) log(1/N) = −K log(1/N) = K log N

Taking the log to base 10 and converting to base 2 (log₁₀N = log₂N / log₂10):

H = (K / log₂10) log₂N

As K is an undetermined constant and s is a constant too, K can be adjusted to be (s log₂10).

This makes H = (s log₂N) bits per second, the same result as the earlier capacity definition gives.
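
The derivation can be checked numerically as well. Below is a minimal Python sketch, again with the assumed illustrative values N = 4 and s = 1000, that evaluates H = −K Σ pᵢ log₁₀pᵢ with K = s log₂10 and compares it against s log₂N.

    import math

    # Numeric check of the derivation above.  Assumed illustrative values
    # (not from the article): N equiprobable symbols at s symbols/second.
    N = 4
    s = 1000

    p = [1.0 / N] * N                 # p_i = 1/N for every symbol
    K = s * math.log2(10)             # K chosen as s*log2(10)

    # H = -K * sum(p_i * log10(p_i)), taking the log to base 10
    H = -K * sum(pi * math.log10(pi) for pi in p)

    print("H           =", H)                    # -> 2000.0 bits per second
    print("s * log2(N) =", s * math.log2(N))     # -> 2000.0, the same value

Both prints give 2000.0, confirming that the adjusted K turns the entropy into a rate in bits per second.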

So if an information source has an entropy of H bits per symbol, it can be said to have (or to require) a capacity of H bits per second. Equivalently, a communication channel that carries the information produced by a source of entropy H bits per symbol must be capable of carrying (i.e. must have a capacity of) H bits per second.

References: "A Mathematical Theory of Communication" by Claude E. Shannon; "An Introduction to Information Theory" by John R. Pierce.

Copyright © Samir Amberkar 2010-11
