Digital Revolution (II) - Compression Codes and Technologies



2. Information and compression

The roots of compression lie with the work of the mathematician Claude Shannon (below).

[Photograph of Claude Shannon]

In his now famous paper "A Mathematical Theory of Communication," Shannon investigated the mathematics of how information is sent from one location to another, as well as how information is transformed from one format to another. For example, to convey the music of an orchestra to your ear, the sounds must be converted to electronic signals (using microphones) and stored on a tape, long-playing record, or CD before being transformed back into something you can listen to. Shannon's work used the concept of a communications channel as a mathematical model of how information gets sent from one location to another. He showed that codes could be designed to correct errors introduced when information is sent over a communications channel: such error-correcting codes must exist, and Richard Hamming (below) later produced explicit examples of them.

[Photograph of Richard Hamming]
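
To give a flavor of what an error-correcting code does, here is a small sketch in Python of the (7,4) Hamming code. It is an illustration added here, not drawn from Shannon's or Hamming's papers: four data bits are expanded to seven bits in such a way that any single flipped bit can be located and corrected.

    # A (7,4) Hamming code sketch (illustrative only): 4 data bits become
    # 7 bits so that any single flipped bit can be located and corrected.

    def hamming74_encode(d1, d2, d3, d4):
        # Each parity bit checks a fixed subset of the seven positions.
        p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_correct(c):
        # Recompute the three checks; together they spell out the error position.
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        pos = s1 + 2 * s2 + 4 * s3     # 0 means no error, otherwise the 1-based position
        if pos:
            c[pos - 1] ^= 1            # flip the offending bit back
        return c

    word = hamming74_encode(1, 0, 1, 1)
    word[4] ^= 1                        # simulate a single transmission error
    print(hamming74_correct(word))      # the original codeword is recovered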

Shannon also showed that it was possible to compress information, and he produced examples of such codes, now known as Shannon-Fano codes. Robert Fano (below) was an electrical engineer at MIT (and the son of Gino Fano, the Italian mathematician who pioneered the development of finite geometries and for whom the Fano plane is named).

[Photograph of Robert Fano]
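
As a rough illustration of the compression idea, the following Python sketch builds a Shannon-Fano code following the usual textbook description of the construction (a description assumed here, not a detail taken from this article): symbols are sorted by probability and recursively split into two groups of nearly equal total probability, with one group receiving a 0 and the other a 1 at each step. The symbols and probabilities in the example are made up.

    # A Shannon-Fano coding sketch (illustrative only).

    def shannon_fano(symbols):
        # symbols: list of (symbol, probability) pairs; returns symbol -> codeword.
        symbols = sorted(symbols, key=lambda sp: sp[1], reverse=True)
        codes = {s: "" for s, _ in symbols}

        def split(group):
            if len(group) <= 1:
                return
            total = sum(p for _, p in group)
            # Choose the cut that makes the two halves' probabilities as equal as possible.
            best_cut, best_diff, running = 1, float("inf"), 0.0
            for i, (_, p) in enumerate(group[:-1]):
                running += p
                diff = abs(total - 2 * running)    # |left total - right total|
                if diff < best_diff:
                    best_cut, best_diff = i + 1, diff
            for s, _ in group[:best_cut]:
                codes[s] += "0"                    # first group extends its codes with 0
            for s, _ in group[best_cut:]:
                codes[s] += "1"                    # second group with 1
            split(group[:best_cut])
            split(group[best_cut:])

        split(symbols)
        return codes

    print(shannon_fano([("a", 0.4), ("b", 0.3), ("c", 0.2), ("d", 0.1)]))
    # {'a': '0', 'b': '10', 'c': '110', 'd': '111'} -- frequent symbols get short codewords

Because the more probable symbols end up with shorter codewords, a typical message takes fewer bits than it would with a fixed-length code.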

The Shannon-Fano codes were soon to be overshadowed by the work of David Huffman, a student of Fano's at MIT.



  1. Introduction
  2. Information and compression
  3. Encoding and decoding
  4. Lossy and non-lossy compression
  5. Huffman codes
  6. Compression methods and their applications
  7. Compression and intellectual property
  8. References