Capacity and Entropy

In the 3rd article, we treated entropy as bits per symbol. Entropy can also be expressed in bits per second if the information source produces symbols at a constant rate. In that case, entropy is nothing but the information transmission rate, or simply the "capacity"!

How does this relate to the earlier definition of capacity given in the 1st article?

C = lim T→∞ (log N(T)) / T

Let us take an example to see whether entropy and the above definition give the same result.

Say an information source produces symbols at a constant rate of s symbols per second. There are N symbols, all equally likely to be produced.

Since the symbols are equally probable, each needs to be represented with the same number of bits, i.e. log₂N. The above definition then gives C as (s log₂N) bits per second (refer to the 1st article).
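As a quick numeric sketch of this step (the values s = 8 and N = 16 below are my own illustrative choices, not from the article), we can compute C = s log₂N directly and cross-check it against the limit definition, since a source emitting s equiprobable symbols per second can produce N^(sT) distinct sequences in T seconds:

```python
import math

# Hypothetical example values (not from the article):
s = 8      # symbols per second
N = 16     # alphabet size, all symbols equally probable

# Each symbol needs log2(N) bits, so the definition gives C = s * log2(N).
bits_per_symbol = math.log2(N)
C = s * bits_per_symbol

print(bits_per_symbol)  # 4.0 bits per symbol
print(C)                # 32.0 bits per second

# Cross-check against C = lim log N(T) / T: in T seconds the source can
# produce N**(s*T) distinct sequences, so log2(N(T))/T = s*log2(N) exactly,
# for any T -- the limit is trivial here because the ratio is constant.
T = 5.0
assert math.isclose(math.log2(N ** int(s * T)) / T, C)
```

Because the symbols are independent and equiprobable, the ratio log N(T)/T does not even depend on T, which is why both routes agree exactly.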

Considering the entropy equation, with all probabilities equal:

pᵢ = 1/N

So H = − K Σ pᵢ log pᵢ = − K (N) (1/N) log(1/N) = − K log(1/N) = K log N

H = (K log₁₀2) log₂N

As K is an undetermined constant and s is a constant too, K can be chosen to be (s / log₁₀2).

This makes H = (s log₂N) bits per second.

So it seems that if an information source has an entropy of H bits per symbol, it can be said to have (or require) a capacity of H bits per second, with the symbol rate absorbed into the constant K as above. Equivalently, a communication channel that carries the information produced by a source of entropy H bits per symbol must be capable of carrying, i.e. must have a capacity of, H bits per second.
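A small numeric check of the conclusion above (the example values s = 8 and N = 16 are my own, not from the article): for N equiprobable symbols, the general entropy sum −Σ pᵢ log₂ pᵢ collapses to log₂N bits per symbol, and multiplying by the symbol rate s gives the same bits-per-second figure as the capacity definition:

```python
import math

# Hypothetical example values (not from the article):
N = 16
s = 8
p = [1.0 / N] * N  # equiprobable distribution

# General entropy sum, with K absorbed so that log is base 2:
H_per_symbol = -sum(pi * math.log2(pi) for pi in p)   # bits per symbol
H_per_second = s * H_per_symbol                        # bits per second

print(H_per_symbol)  # 4.0, i.e. log2(16)
print(H_per_second)  # 32.0, same as s * log2(N)
assert math.isclose(H_per_symbol, math.log2(N))
```

Note that the full sum and the closed form log₂N agree only because the distribution is uniform; for a non-uniform source the entropy (and hence the required capacity) would be strictly smaller.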

References: A Mathematical Theory of Communication by Claude E. Shannon, An Introduction to Information Theory by John R. Pierce.

Copyright © Samir Amberkar 2010-11
