Lest the gentle reader think that incredible claims are confined to the world of wireless, let me assure you that this is not the case. Attempts to do the impossible have occurred in many areas of science and engineering. To give but one example, the field of data compression has been fertile ground for incredible compression schemes. In particular, the anticipated financial rewards for anyone who develops a method for transmitting full-rate video over phone lines attracted a number of interesting players, such as Pixelon, Media World Communication with its Adams platform, and Madison Priest with his magic box. The stories of incredible compression techniques make for some fascinating reading. Here again the inventors blithely ignored the fundamental limits on compression specified by Shannon, our friend from the previous postings. The Shannon entropy equation provides a way to estimate the average minimum number of bits needed to encode a string of symbols, based on the frequencies of the symbols. In particular, a string of independent random symbols with equal frequencies of occurrence cannot be compressed without loss of information. This well-established fact does not seem to dampen the enthusiasm and creativity of the compression schemers, who promise to losslessly compress incompressible data into a small fraction of its size. Significant amounts of money have been lost by investors in these enterprises.
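The frequency-based Shannon bound mentioned above can be sketched in a few lines of Python (the function name `min_bits` is my own, for illustration):

```python
from collections import Counter
from math import log2

def min_bits(symbols):
    """Estimate the Shannon lower bound, in bits, for losslessly
    encoding the sequence, based on observed symbol frequencies."""
    counts = Counter(symbols)
    n = len(symbols)
    # Entropy in bits per symbol: -sum(p * log2(p)) over symbol frequencies.
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return entropy * n  # bits per symbol, times sequence length

# Eight equally frequent symbols need log2(8) = 3 bits each:
print(min_bits("abcdefgh"))  # 24.0
# A string of one repeated symbol carries no information at all:
print(min_bits("aaaaaaaa"))  # 0.0
```

As the second call shows, highly repetitive data can be squeezed a great deal; but when every symbol is equally likely and independent, the bound equals the raw length of the data, and no lossless scheme can beat it.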

There is an especially simple and intuitive proof that random data cannot always be compressed, called the "counting argument": there are 2^n bit strings of length n, but only 2^n - 1 strings shorter than n bits, so any scheme that shortens every input must map two different inputs to the same output and lose information. Despite this, some people *still* claim (and some investors *still* believe in) magical algorithms that can compress anything, presumably even their own output. So with enough iterations it should be possible to compress anything down to one very, very random bit!
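The counting argument is easy enough to verify directly; here is a minimal sketch (the function name `counting_argument` is my own):

```python
def counting_argument(n):
    """Return (number of bit strings of length n,
    number of bit strings strictly shorter than n)."""
    length_n = 2 ** n
    # Strings of lengths 0 through n-1, including the empty string:
    shorter = sum(2 ** k for k in range(n))  # equals 2**n - 1
    return length_n, shorter

inputs, outputs = counting_argument(10)
print(inputs, outputs)  # 1024 1023
```

With 1024 possible inputs and only 1023 possible shorter outputs, the pigeonhole principle guarantees a collision: at least two distinct inputs would decompress from the same output, so the scheme cannot be lossless.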