Implications
The capacity theorem (Theorem 17, page 43 of the paper) states that the capacity of a noisy channel with an average power limitation is:

    C = W log2((P + N) / N)

where W is the channel bandwidth, P is the average signal power, and N is the average (white thermal) noise power.
We will quickly look at the implications of this equation.
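To make the equation concrete, here is a minimal sketch in Python. The bandwidth, signal, and noise values are illustrative assumptions, not numbers from Shannon's paper.

    import math

    def capacity(bandwidth_hz, signal_w, noise_w):
        # Shannon capacity C = W * log2((P + N) / N), in bits per second
        return bandwidth_hz * math.log2((signal_w + noise_w) / noise_w)

    # Assumed example: 1 MHz bandwidth, signal power 100, noise power 1
    # (i.e. SNR = 100, or 20 dB)
    print(capacity(1e6, 100.0, 1.0))   # ~6.66e6 bits/s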
First, capacity is directly proportional to bandwidth. This seems intuitive, since more bandwidth means more frequency components, and hence more variations to draw on, improving capacity directly. A better argument can be made with the help of the well-known Fourier theorem: every periodic waveform can be represented as a sum of sine waveforms. So whichever modulation and multiple-access technique we use, a waveform can be represented as a sum of frequencies. More frequencies mean more information streams, and hence more capacity (theoretically speaking).
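The linear dependence on bandwidth is easy to check with the capacity function from the sketch above (same illustrative SNR of 100):

    # Doubling the bandwidth exactly doubles the capacity
    print(capacity(1e6, 100.0, 1.0))   # ~6.66e6 bits/s
    print(capacity(2e6, 100.0, 1.0))   # ~13.32e6 bits/s -- exactly 2x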
Second, capacity is not directly proportional to the signal-to-noise ratio, but rather to its logarithm. If we increase the (average) signal power, we do not get a directly proportional gain in capacity; we get a gain proportional to the logarithm of the signal-to-noise ratio. This is a consequence of entropy laws and the Gaussian nature of (white thermal) noise, as seen in the earlier article.
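By contrast, doubling the signal power buys far less. Continuing with the same capacity function and the same illustrative numbers:

    # Doubling the signal power adds only about one bit/s per Hz (at high SNR)
    print(capacity(1e6, 100.0, 1.0))   # ~6.66e6 bits/s
    print(capacity(1e6, 200.0, 1.0))   # ~7.65e6 bits/s -- only ~15% more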
So it seems preferable to go for more bandwidth rather than higher signal power. Bandwidth limitations arise from the capabilities of the communication medium. In wired communications (especially dedicated links), using more bandwidth is the best way to get more capacity (limited by cost, logistics, and future-scope considerations). In wireless communications, bandwidth is very precious, as the air medium is highly shared and open. So in wireless communications, effort is also made to improve the signal-to-noise ratio at the receiver side, while attention is paid to efficient use of the available bandwidth. One good example is the use of cells in mobile communications, wherein an area is divided into cells and frequencies are reused across a number of cells. Check out the LTE article for the latest discussions on this topic.
Though there are a lot of variables involved when we actually implement communication techniques, Shannon's equation does give us a very good measure to work with.
This concludes our discussion on Shannon's paper.
References: Claude E. Shannon, "A Mathematical Theory of Communication"; John R. Pierce, "An Introduction to Information Theory".