Rate in terms of noise entropy

Let us put down our findings from the article on Information rate with noise and try to combine them.

1) R = H(x) - H_y(x)

2) H(x,y) = H(y) + H_y(x)

3) H(x,n) = H(x) + H(n)

Here the destination (y) symbols are produced by the impact of noise (n) symbols on source (x) symbols. This means each y symbol corresponds to the combination of a certain x symbol and a certain n symbol. So the number of combinations of x and y is the same as the number of combinations of x and n. In other words, the average number of bits per symbol required to represent the combined x and n signals is the same as the average number of bits per symbol required to represent the combined x and y signals.

That is: 4) H(x,y) = H(x,n)

Combining the above equations: from 1) and 2), R = H(x) - [H(x,y) - H(y)] = H(x) + H(y) - H(x,y); applying 4) and then 3), H(x,y) = H(x,n) = H(x) + H(n), so R = H(y) - H(n). This is Theorem 16 (page 43) of Shannon's paper.
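To make the combination concrete, below is a minimal numerical check in Python. It is only a sketch: the binary channel y = x XOR n and the probability values are assumptions chosen for illustration, picked so that (x, y) pairs correspond one-to-one to (x, n) pairs, as described above.

```python
from math import log2

def H(p):
    """Shannon entropy (bits per symbol) of a discrete distribution."""
    return -sum(q * log2(q) for q in p if q > 0)

# Toy binary channel y = x XOR n, with x and n independent.
px = [0.7, 0.3]   # source symbol probabilities (assumed values)
pn = [0.9, 0.1]   # noise symbol probabilities (assumed values)

# Joint p(x, y): given x, the noise symbol n = x XOR y is uniquely
# determined, so (x, y) pairs map one-to-one to (x, n) pairs.
pxy = [px[x] * pn[x ^ y] for x in (0, 1) for y in (0, 1)]
py = [sum(px[x] * pn[x ^ y] for x in (0, 1)) for y in (0, 1)]

Hx, Hn, Hy, Hxy = H(px), H(pn), H(py), H(pxy)
Hy_x = Hxy - Hy        # equivocation H_y(x), from equation 2)

print(Hxy, Hx + Hn)    # equations 3) and 4): H(x,y) = H(x) + H(n)
print(Hx - Hy_x)       # R from equation 1)
print(Hy - Hn)         # R from Theorem 16 -- the same value
```

Both expressions for R print the same value (about 0.456 bits per symbol for the probabilities above), and H(x,y) matches H(x) + H(n).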

Can we arrive at the above equation directly? Consider the following diagram:

[Diagram: each source symbol Sx combines with each noise symbol Sn to produce a destination symbol Sy]

Each Sx is combined with each Sn (based on their probabilities) to give us an Sy. As x and n are independent, the diagram holds true whatever the individual probabilities (uncertainties, or entropies) of x and n may be. If n is not present, or has zero entropy (uncertainty), each y symbol maps to exactly one x symbol. Due to the presence of n's symbols (non-zero entropy), more bits are needed for y, as shown in the diagram above. If you examine the diagram and try b = 1, 2, 4, ..., it becomes evident that to get the actual bits per symbol, we must deduct the noise-introduced bits per symbol of n from the observed bits per symbol of y.

That is: R = H(y) - H(n)
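As a quick sanity check on the bit-counting argument above, take an assumed case with 4 equiprobable x symbols (2 bits) and 2 equiprobable, independent n symbols (1 bit), where every (x, n) pair produces a distinct y, so that y has 8 equiprobable symbols (3 bits):

$$ R = H(y) - H(n) = \log_2 8 - \log_2 2 = 3 - 1 = 2 \ \text{bits/symbol} = H(x) $$

Deducting the noise's bits per symbol from y's observed bits per symbol recovers exactly the source's bits per symbol, as the diagram suggests.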

References: "A Mathematical Theory of Communication" by Claude E. Shannon; "An Introduction to Information Theory" by John R. Pierce.

Copyright © Samir Amberkar 2010-11
