Information rate with noise

In earlier articles, we covered the transmission rate mainly from the sender's (information source's) point of view. In effect, we assumed an error-free, perfect transmission medium, i.e. the signals sent by the sender reach the receiver with no errors. Can we arrive at an "effective" transmission rate when there is noise in the channel that results in transmission errors? We will assume that the noise (the errors it introduces) is independent of the signals sent. This is an important assumption, and a reasonably safe one.

When we consider such a scenario, we realise that a received signal may be in error. This adds a second "uncertainty" on top of the existing uncertainty of the information source, something like: when a signal is received, what is the probability that it is error-free?

To correct transmission errors, we will have to send a few extra (redundant) bits in addition to the bits per symbol that carry the actual information.

Combining the above two points, we can say that the effective transmission rate (or entropy) would be:

R = H(x) - Hy(x)

where x denotes the source and y the destination.

Hy(x) is the additional information that would be needed to correct the received message, so it has to be subtracted from H(x). Hy(x) is the "conditional entropy" of x given y (Shannon calls it the "equivocation").

The above equation reduces to R = H(x) when there is no noise, as we expect it to.
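To make the formula concrete, here is a small numerical sketch (my own illustration, not taken from Shannon's paper) assuming an equiprobable binary source sent over a binary symmetric channel with crossover (error) probability p. In that case H(x) = 1 bit per symbol and the equivocation Hy(x) works out to the binary entropy of p, so R = 1 - Hy(x).

import math

def binary_entropy(p):
    """Entropy (in bits) of a binary random variable with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Equiprobable binary source: H(x) = 1 bit per symbol.
H_x = 1.0

for p in (0.0, 0.01, 0.1, 0.5):
    # For a binary symmetric channel with equiprobable input, the
    # equivocation Hy(x) equals the binary entropy of the crossover
    # probability p (this is the standard textbook result assumed here).
    H_y_x = binary_entropy(p)
    R = H_x - H_y_x          # R = H(x) - Hy(x)
    print(f"p = {p:4.2f}  Hy(x) = {H_y_x:.3f}  R = {R:.3f} bits/symbol")

With p = 0 we get R = H(x) back, and with p = 0.5 the rate drops to zero, since every received bit is as likely to be wrong as right.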

Let us now consider the combined (joint) entropy H(x,y). What can we say about it? Can we say H(x,y) = H(x) + H(y)?

The answer is no, because the uncertainty in reception is not independent of what is being transmitted. Rather, we need to say:

H(x,y) = H(x) + Hx(y)

where Hx(y) is the uncertainty of the received signal when the transmitted one is known. Similar logic in the other direction (from the receiver's side) gives us H(x,y) = H(y) + Hy(x).
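As a quick check of this chain rule (again only an illustrative sketch, with a made-up joint distribution rather than anything from the paper), we can pick a small joint distribution p(x,y) in which x and y are dependent and verify numerically that H(x,y) = H(x) + Hx(y) = H(y) + Hy(x), while H(x) + H(y) overestimates it.

import math

def H(probs):
    """Entropy (bits) of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up joint distribution p(x, y) over x in {0,1}, y in {0,1},
# modelling a noisy channel: x and y agree most of the time.
p_xy = {(0, 0): 0.45, (0, 1): 0.05,
        (1, 0): 0.10, (1, 1): 0.40}

p_x = {x: sum(v for (xx, y), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (x, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

H_xy = H(list(p_xy.values()))
H_x = H(list(p_x.values()))
H_y = H(list(p_y.values()))

# Conditional entropies: Hx(y) = sum over x of p(x) * H(y given x); Hy(x) likewise.
Hx_y = sum(p_x[x] * H([p_xy[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1))
Hy_x = sum(p_y[y] * H([p_xy[(x, y)] / p_y[y] for x in (0, 1)]) for y in (0, 1))

print(f"H(x,y)       = {H_xy:.4f}")
print(f"H(x) + Hx(y) = {H_x + Hx_y:.4f}")
print(f"H(y) + Hy(x) = {H_y + Hy_x:.4f}")
print(f"H(x) + H(y)  = {H_x + H_y:.4f}  (larger, since x and y are dependent)")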

In the above discussion, we took the noise to be independent of the signals sent. So, if we consider the combined entropy of the transmitted signals x and the noise n, it would simply be:

H(x,n) = H(x) + H(n)
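A similar numerical check (again just an illustrative sketch, not from the paper) shows why independence makes the entropies add: when x and n are independent, the joint distribution factorises as p(x,n) = p(x)p(n), and H(x,n) comes out exactly as H(x) + H(n).

import math

def H(probs):
    """Entropy (bits) of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_x = [0.5, 0.5]      # transmitted symbols, equiprobable
p_n = [0.9, 0.1]      # noise: "no error" with prob 0.9, "error" with prob 0.1

# Independence: the joint distribution is just the product of the marginals.
p_xn = [px * pn for px in p_x for pn in p_n]

print(f"H(x) + H(n) = {H(p_x) + H(p_n):.4f}")
print(f"H(x,n)      = {H(p_xn):.4f}")   # identical, since x and n are independent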

In the next article, we will try to combine these observations and see where they lead us.

References: "A Mathematical Theory of Communication" by Claude E. Shannon; "An Introduction to Information Theory" by John R. Pierce.

Copyright © Samir Amberkar 2010-11
