One of the remarkable events in the communications field was that, relatively early in its development (in 1948), Claude Shannon derived fundamental limits on reliable communication in the presence of noise. The Shannon limit is very well known and is used extensively in the study of the performance of communications systems. By comparing the performance of a particular system to the best possible performance specified by the bound, we get an idea of how much room for improvement remains. Modern communication systems perform extremely close to the Shannon limit.

The Shannon limit in its simplest form states that the channel capacity C, meaning the theoretical maximum rate of essentially error-free data that can be sent with a given average signal power S through an analog communication channel subject to additive white Gaussian noise of power N, is:

C = B log2(1 + S/N)

where: C is the channel capacity in bits per second; B is the bandwidth of the channel in hertz; S is the total signal power over the bandwidth, measured in watts; N is the total noise power over the bandwidth, measured in watts; and S/N is the signal-to-noise ratio (SNR), also called the carrier-to-noise ratio (CNR), of the communication signal to the Gaussian noise.
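The formula above is straightforward to evaluate numerically. As a minimal sketch (the function name and example parameters are illustrative, not from the text):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr: float) -> float:
    """Channel capacity C = B * log2(1 + S/N) in bits per second,
    for an AWGN channel with linear (not dB) signal-to-noise ratio."""
    return bandwidth_hz * math.log2(1 + snr)

# Example: a 20 MHz channel with SNR = 1 (i.e. 0 dB)
c = shannon_capacity(20e6, 1.0)
print(c / 1e6)  # 20.0 -> capacity is 20 Mbit/s
```

Note that the SNR here is a linear power ratio; an SNR quoted in dB must first be converted via S/N = 10^(dB/10).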

It should be noted that this bound applies to a fixed point-to-point communication channel, i.e. a situation where one transmitter communicates with one receiver and the propagation loss has a fixed value over time and frequency. More sophisticated versions of the bound have been developed for fading channels (where the propagation loss varies with time), for frequency-selective channels (where the propagation loss varies with frequency), and for systems involving multiple transmitters and receivers.

It is important to understand that this bound is fundamental in the sense that, as long as the underlying assumptions hold (e.g. fixed path loss, point-to-point communication, etc.), it is not possible to design a communication system which will violate it. If we specify the bandwidth B and the SNR, the data rate must be smaller than B log2(1+S/N). If we specify a bandwidth B and a data rate R, the SNR at the receiver must be greater than 2^(R/B) - 1. It does not matter how clever you are or how sophisticated your modulation and coding scheme is; the data rate, SNR, and bandwidth must obey this limit. If someone claims to have developed a new modulation/coding technique which will transmit 100 Mbps over a channel with bandwidth B = 20 MHz and SNR = 1 at the receiver, you note that 100 > 20 log2(1+1) = 20 (i.e. the claimed data rate is 5 times larger than the bound), and you can immediately conclude that the claim is not feasible, without having to inquire about how their system works. This is analogous to the situation where someone claims to have built a perpetual motion machine: you can safely discount the claim without having to spend the time and effort required to study how their system works.
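The feasibility check described above can be automated. The sketch below (function names are illustrative) tests the 100 Mbps claim from the text and also computes the minimum SNR that the bound 2^(R/B) - 1 would actually require:

```python
import math

def is_feasible(rate_bps: float, bandwidth_hz: float, snr: float) -> bool:
    """True if the claimed rate does not exceed the Shannon capacity."""
    return rate_bps <= bandwidth_hz * math.log2(1 + snr)

def required_snr(rate_bps: float, bandwidth_hz: float) -> float:
    """Minimum linear SNR needed to support rate R over bandwidth B."""
    return 2 ** (rate_bps / bandwidth_hz) - 1

# The claim from the text: 100 Mbps over B = 20 MHz with SNR = 1
print(is_feasible(100e6, 20e6, 1.0))  # False -> the claim violates the bound
print(required_snr(100e6, 20e6))      # 31.0 -> SNR of at least 31 (~15 dB) needed
```

With SNR = 1 the capacity is only 20 Mbps, so supporting 100 Mbps in 20 MHz would require an SNR of 2^5 - 1 = 31, about 31 times more signal power than claimed.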

The Shannon limit is not the only limitation that comes up in wireless systems, although it is the best known and probably the most important one. We will discuss some other bounds in later postings.

This notion that there are fundamental limits on what can be done is an uncomfortable one. We would like to believe, and many people in fact do believe, that anything is possible, that there is always hope. Accepting the notion that there are “hard limits” on what can be done, no matter how creative or smart we are, is disturbing and incomprehensible to many people. Yet it is nonetheless true. Nobody has successfully built a perpetual motion machine or a communication system which does better than the Shannon limit. But that does not stop people from trying …