Two important parameters of any wireless system are its spectral efficiency – how well it utilizes the available spectrum – and its range – how far it can reach. Thus we often read statements such as “WiMAX offers much higher spectral efficiency than WCDMA” or “WiMAX has a much greater range than WiFi”. A fact rarely discussed in marketing material and the so-called “technical white papers” is that there is a tradeoff between spectral efficiency and range: higher spectral efficiency requires higher signal power at the receiver, which can be obtained only at a shorter distance.

To illustrate this tradeoff, consider the following figure, which shows the maximum range for systems operating at different spectral efficiencies but with otherwise identical parameters (transmit power, antenna gains, etc.). For the technically inclined: this graph was calculated for a 900 MHz system transmitting 1 Watt, with a combined transmit–receive antenna gain of 7 dB, a receiver noise figure of 5 dB, a base station antenna height of 10 meters, and a mobile antenna height of 2 meters. The graph is for a single cell with no interference and no fade margin (so the range values are very optimistic). The required SNR for a given spectral efficiency was computed using Shannon’s capacity formula, and the path loss was computed using the small-city Hata model.
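The calculation behind the figure can be sketched in a few lines of Python. This is only an illustration, not the exact code behind the graph: the function names are mine, and since the post does not state the bandwidth used for the noise calculation, I assume 5 MHz here. The other parameters (900 MHz, 1 W, 7 dB gains, 5 dB noise figure, 10 m and 2 m antenna heights) are the ones given above.

```python
import math

def hata_small_city_loss(d_km, f_mhz=900.0, hb_m=10.0, hm_m=2.0):
    """Hata path loss (dB) for a small/medium city at distance d_km."""
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * hm_m - (1.56 * math.log10(f_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(f_mhz) - 13.82 * math.log10(hb_m)
            - a_hm + (44.9 - 6.55 * math.log10(hb_m)) * math.log10(d_km))

def max_range_km(se_bps_hz, bw_hz=5e6, ptx_dbm=30.0, gains_db=7.0, nf_db=5.0):
    """Maximum range (km) at which the link still supports the target
    spectral efficiency, per Shannon's formula and the Hata model.
    The 5 MHz bandwidth is my assumption, not stated in the post."""
    snr_db = 10 * math.log10(2 ** se_bps_hz - 1)       # Shannon: SE = log2(1 + SNR)
    noise_dbm = -174 + 10 * math.log10(bw_hz) + nf_db  # thermal noise + noise figure
    max_loss_db = ptx_dbm + gains_db - (noise_dbm + snr_db)
    # Hata loss grows linearly in log10(d); invert it around the 1 km value
    slope = 44.9 - 6.55 * math.log10(10.0)             # dB per decade (hb = 10 m)
    return 10 ** ((max_loss_db - hata_small_city_loss(1.0)) / slope)

for se in (0.5, 1, 2, 4):
    print(f"SE = {se} bps/Hz -> max range ~ {max_range_km(se):.2f} km")
```

Running this reproduces the shape of the curve: each doubling of spectral efficiency costs several dB of required SNR, which the Hata model converts into a shorter maximum range.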

Examining this curve shows how the maximum range depends on the spectral efficiency of the communication system. We see, for example, that a system with a spectral efficiency of 4 bps/Hz has half the range of a system with a spectral efficiency of 1 bps/Hz. This raises the question: which is more important – increasing the range or increasing the spectral efficiency? The answer, as is often the case in wireless, is “it depends”.

Let us say that you want to provide wireless coverage of a given area, and you have been assigned a slice of spectrum of, say, 5 MHz. The area has a certain user density, i.e. a number of users per square mile. Let us talk for the moment about the density of *active* users, i.e. users who are actually using the wireless service simultaneously. This can be translated into an equivalent number of subscribers using the fraction of subscribers that are active at any given time. Assume that each active user has been promised a certain average data rate; call it the “nominal rate”. Then the total data rate your service must provide per square mile is the user density multiplied by the nominal rate. The actual data rate that *can* be provided by the system is its spectral efficiency multiplied by the available spectrum. Therefore the size of the area that *could* be served by this system (assuming it can reach that far) is given by

Area = (spectral efficiency × available spectrum) / (user density × nominal rate)
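In code, this capacity-versus-demand relation is a one-liner (a sketch: the function name and units are mine, with spectrum in Hz and rates in bps):

```python
def coverage_area_sq_miles(se_bps_hz, spectrum_hz, users_per_sq_mile, nominal_rate_bps):
    """Area (square miles) whose aggregate demand one system's capacity can serve."""
    capacity_bps = se_bps_hz * spectrum_hz              # what the system can deliver
    demand_bps = users_per_sq_mile * nominal_rate_bps   # what one square mile needs
    return capacity_bps / demand_bps

# Early-deployment numbers: SE = 1, 5 MHz, 1 active user/sq mi at 100 kbps
print(coverage_area_sq_miles(1, 5e6, 1, 100e3))  # -> 50.0
```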

As an example, consider a system with a spectral efficiency of 1 and 5 MHz of spectrum, so the numerator of this equation equals 5 Mbps. The denominator depends on the user density and the nominal data rate per user. Let us say we are at an early stage of deploying our network and have a very low user density, say 1 active user per square mile, with a nominal rate of 100 kbps. Then the area we could cover is 50 square miles. We would therefore like the range of our system to be as large as possible, so that we can provide service with the smallest number of basestations. If the range of our system is, say, 2 miles, a single basestation can cover only about 12 square miles – roughly a quarter of the area we could have covered with a larger range.

As time goes by, assuming our network attracts customers, the user density will increase. Also, more bandwidth-hungry wireless devices become available, so the nominal rate will increase as well. Let us say that we now have 20 active users per square mile, each requiring 250 kbps. Now the area over which we can provide service is only 1 square mile. So where earlier we had one basestation every 12 square miles, we now need one per square mile – we have to increase the number of deployed basestations by a factor of 12! What can be done to improve the situation? Note that our system now requires a range of only 0.56 miles (to cover 1 square mile). So if we could increase the spectral efficiency (and pay with decreased range), that would be a good trade. If, for example, we used a system with a spectral efficiency of 4, we could cover 4 square miles instead of just 1.
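The arithmetic of this mature-network scenario can be checked directly (again a sketch; variable names are mine, and the required range assumes an idealized circular cell):

```python
import math

spectrum_hz = 5e6
users, rate = 20, 250e3                  # 20 active users/sq mi at 250 kbps each
demand = users * rate                    # aggregate demand per square mile: 5 Mbps

for se in (1, 4):
    area = se * spectrum_hz / demand     # square miles one cell's capacity can serve
    radius = math.sqrt(area / math.pi)   # range needed to cover a circle of that area
    print(f"SE = {se}: area = {area:.0f} sq mi, required range = {radius:.2f} mi")
```

At a spectral efficiency of 1 this gives 1 square mile and a required range of about 0.56 miles; at a spectral efficiency of 4 it gives 4 square miles, matching the figures in the text.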

So what can we conclude from this discussion? The range of a basestation in a cellular system matters most when the user density and the data rate required per user are low. As the user density and the required data rate increase, range becomes less important and spectral efficiency becomes more important. If you “plan for success”, you need a system with the highest possible spectral efficiency, so that you can serve the largest number of users with a given amount of spectrum. That is why cellular systems keep increasing their spectral efficiency as we move from 2G to 3G to 4G.

Unfortunately, spectral efficiency has practical limits and cannot be made arbitrarily high; why that is so requires a separate discussion. Once you have the highest practical spectral efficiency, the only way to keep providing the required service to more and more users is to increase the density of basestations (or to buy more spectrum). That fact has some important consequences, which will be discussed in a future post tentatively titled “The dark secret of mobile broadband wireless”.